Data importing : Importing the relevant libraries. The pyforest library lazy-imports the most common data-science libraries (pandas, numpy, matplotlib, seaborn, ...) with a single import.
import pyforest
import warnings
warnings.filterwarnings("ignore")
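For readers without pyforest installed, the aliases this notebook relies on can be bound explicitly instead — a minimal sketch (the plotting imports are shown as comments, since only pandas and numpy are exercised in this snippet):

```python
# Explicit equivalents of the names pyforest lazy-imports on first use
import pandas as pd
import numpy as np
# import matplotlib.pyplot as plt   # plotting, used later in this notebook
# import seaborn as sns             # statistical plots, used later in this notebook

print(pd.__name__, np.__name__)  # → pandas numpy
```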
Data importing : Importing the raw TfL bike-hire Excel file
data=pd.read_excel('tfl-daily-cycle-hires.xlsx',engine="openpyxl")
# Checking the head of the data
data.head()
| Day | Number of Bicycle Hires | Unnamed: 2 | Month | Number of Bicycle Hires.1 | Unnamed: 5 | Year | Number of Bicycle Hires.2 | Unnamed: 8 | Month.1 | Average Hire Time (mins) | Unnamed: 11 | Unnamed: 12 | Unnamed: 13 | Unnamed: 14 | Unnamed: 15 | Unnamed: 16 | Unnamed: 17 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | 6897 | NaN | 2010-07-01 | 12461.0 | NaN | 2010 | 2180813 | NaN | 2010-07-01 | 17.232566 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 1 | 2010-07-31 | 5564 | NaN | 2010-08-01 | 341203.0 | NaN | 2011 | 7142449 | NaN | 2010-08-01 | 16.551880 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2 | 2010-08-01 | 4303 | NaN | 2010-09-01 | 540859.0 | NaN | 2012 | 9519283 | NaN | 2010-09-01 | 15.219079 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 3 | 2010-08-02 | 6642 | NaN | 2010-10-01 | 544412.0 | NaN | 2013 | 8045459 | NaN | 2010-10-01 | 15.204481 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 4 | 2010-08-03 | 7966 | NaN | 2010-11-01 | 456304.0 | NaN | 2014 | 10023897 | NaN | 2010-11-01 | 13.776083 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
Data importing : Importing the raw power generation Excel file
power_generation=pd.read_excel('power_generation.xlsx',engine="openpyxl")
power_generation['SD']=pd.to_datetime(power_generation['SD'], format='%Y-%m-%d')
power_generation.rename(columns={'SD':'Date'},inplace=True)
power_generation.head()
| Date | Gas | Coal | Nuclear | Hydro | Net Pumped | Wind | OCGT | Oil | Biomass | French Int | Dutch Int | NI Int | Eire Int | Nemo Int | Net Supply | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | 431920 | 184603 | 122467 | 4417 | -4582 | 4841 | 15 | 0 | 0 | 44898 | 0 | -7411 | 0 | 0 | 781168 |
| 1 | 2010-07-31 | 406077 | 111091 | 121983 | 4604 | 726 | 7013 | 0 | 0 | 0 | 46443 | 0 | -4932 | 0 | 0 | 693004 |
| 2 | 2010-08-01 | 393442 | 109041 | 126746 | 4839 | -6091 | 4264 | 0 | 0 | 0 | 47760 | 0 | -5775 | 0 | 0 | 674225 |
| 3 | 2010-08-02 | 429981 | 190693 | 122512 | 3638 | -2698 | 866 | 0 | 0 | 0 | 45391 | 0 | -7895 | 0 | 0 | 782488 |
| 4 | 2010-08-03 | 433955 | 182201 | 125603 | 3594 | -4137 | 5358 | 4 | 0 | 0 | 45788 | 0 | -7593 | 0 | 0 | 784771 |
Data importing : Importing the weather data for the dates in the dataset via the API at https://www.worldweatheronline.com/. The API key was purchased for personal use. The request below should not be re-run, as the provider charges per weather call.
from wwo_hist import retrieve_hist_data
frequency=24
start_date = '30-JUL-2010'
end_date = '28-FEB-2021'
api_key = '3bca1e937a********2211503' # The API key is masked with asterisks for privacy.
location_list = ['London']
hist_weather_data = retrieve_hist_data(api_key,location_list,start_date,end_date,frequency,
location_label = False,export_csv = True,store_df = True)
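Since every API request is billed, a simple cache guard keeps the call from re-running once 'London.csv' exists — a sketch, assuming the same file name and a hypothetical `fetch` callable standing in for a wrapper around `retrieve_hist_data`:

```python
import os
import pandas as pd

def load_weather(path='London.csv', fetch=None):
    """Read cached weather data if present; only hit the paid API when the cache is missing."""
    if os.path.exists(path):
        return pd.read_csv(path)
    if fetch is None:
        raise FileNotFoundError(f"{path} is missing and no fetch function was supplied")
    return fetch()  # e.g. a wrapper that calls retrieve_hist_data and writes the csv
```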
Data importing : The weather data is saved as a .csv file in the directory as 'London.csv'.
Data cleaning : Importing the saved weather data and standardising its date columns
weatherdata=pd.read_csv('London.csv')
weatherdata['date_time']=pd.to_datetime(weatherdata['date_time'], format='%Y-%m-%d %H:%M:%S')
weatherdata['Dates']=weatherdata['date_time'] # duplicate of the parsed date column, kept for later merges
weatherdata.rename(columns={'date_time':'Date'},inplace=True)
# Checking the head of the dataframe
weatherdata.head()
| Date | maxtempC | mintempC | totalSnow_cm | sunHour | uvIndex | moon_illumination | moonrise | moonset | sunrise | ... | cloudcover | humidity | precipMM | pressure | tempC | visibility | winddirDegree | windspeedKmph | location | Dates | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | 24 | 11 | 0.0 | 13.5 | 4 | 68 | 09:56 PM | 10:10 AM | 05:21 AM | ... | 39 | 70 | 0.0 | 1015 | 24 | 9 | 269 | 9 | London | 2010-07-30 |
| 1 | 2010-07-31 | 22 | 16 | 0.0 | 10.5 | 4 | 64 | 10:11 PM | 11:19 AM | 05:23 AM | ... | 74 | 81 | 2.0 | 1012 | 22 | 8 | 237 | 12 | London | 2010-07-31 |
| 2 | 2010-08-01 | 22 | 12 | 0.0 | 12.5 | 4 | 54 | 10:28 PM | 12:27 PM | 05:24 AM | ... | 57 | 77 | 2.9 | 1013 | 22 | 9 | 257 | 8 | London | 2010-08-01 |
| 3 | 2010-08-02 | 22 | 13 | 0.0 | 14.5 | 4 | 46 | 10:50 PM | 01:39 PM | 05:26 AM | ... | 33 | 75 | 7.3 | 1017 | 22 | 9 | 257 | 8 | London | 2010-08-02 |
| 4 | 2010-08-03 | 23 | 13 | 0.0 | 13.5 | 4 | 39 | 11:17 PM | 02:51 PM | 05:27 AM | ... | 37 | 77 | 0.1 | 1016 | 23 | 7 | 252 | 9 | London | 2010-08-03 |
5 rows × 26 columns
Data importing : Importing the Holidays data
holidays=pd.read_excel('uk_holidays.xlsx',engine="openpyxl")
# Checking the head of the dataframe
holidays.head()
| Date | Non_work_day | |
|---|---|---|
| 0 | 2010-01-01 | 1 |
| 1 | 2010-04-02 | 1 |
| 2 | 2010-04-05 | 1 |
| 3 | 2010-05-03 | 1 |
| 4 | 2010-05-31 | 1 |
Data cleaning : Filtering the dataframe on the features required for the analysis.
data=data[['Day', 'Number of Bicycle Hires', 'Month','Number of Bicycle Hires.1', 'Year',
'Number of Bicycle Hires.2', 'Month.1','Average Hire Time (mins)']]
# Checking the head of the dataframe
data.head()
| Day | Number of Bicycle Hires | Month | Number of Bicycle Hires.1 | Year | Number of Bicycle Hires.2 | Month.1 | Average Hire Time (mins) | |
|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | 6897 | 2010-07-01 | 12461.0 | 2010 | 2180813 | 2010-07-01 | 17.232566 |
| 1 | 2010-07-31 | 5564 | 2010-08-01 | 341203.0 | 2011 | 7142449 | 2010-08-01 | 16.551880 |
| 2 | 2010-08-01 | 4303 | 2010-09-01 | 540859.0 | 2012 | 9519283 | 2010-09-01 | 15.219079 |
| 3 | 2010-08-02 | 6642 | 2010-10-01 | 544412.0 | 2013 | 8045459 | 2010-10-01 | 15.204481 |
| 4 | 2010-08-03 | 7966 | 2010-11-01 | 456304.0 | 2014 | 10023897 | 2010-11-01 | 13.776083 |
Data cleaning : Splitting the data into daily, monthly and yearly views and other filtered subsets
data_daily=data[['Day','Number of Bicycle Hires']]
data_monthly_hire_count=data[['Month','Number of Bicycle Hires.1']].head(128)
data_yearly_hires=data[['Year','Number of Bicycle Hires.2']].head(10)
data_average_time=data[['Month.1','Average Hire Time (mins)']].head(128)
percent_hires=data[['Year','Number of Bicycle Hires.2']][13:128]
Data cleaning : Renaming the columns
data_daily.rename(columns={'Day':'Date','Number of Bicycle Hires':'Number_of_Bicycle_Hires'},inplace=True)
data_yearly_hires.rename(columns={'Number of Bicycle Hires.2':'Bike_Hires_Yearly'},inplace=True)
data_average_time.rename(columns={'Month.1':'Month','Average Hire Time (mins)':'Average_Hire_Time_Average'},inplace=True)
data_monthly_hire_count.rename(columns={'Number of Bicycle Hires.1':'Bike_Hires_Monthly'},inplace=True)
percent_hires.rename(columns={'Number of Bicycle Hires.2':'Bike_Hires_Percent'},inplace=True)
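With warnings silenced at the top of the notebook, pandas' SettingWithCopyWarning on these slice-then-rename steps goes unseen; taking an explicit `.copy()` of each slice makes the intent unambiguous. A sketch with toy data, using hypothetical names so as not to clash with the notebook's variables:

```python
import pandas as pd

raw = pd.DataFrame({'Day': pd.to_datetime(['2010-07-30', '2010-07-31']),
                    'Number of Bicycle Hires': [6897, 5564]})

# .copy() detaches the slice from the parent frame, so the rename below
# cannot silently act on a view of `raw`
daily_slice = raw[['Day', 'Number of Bicycle Hires']].copy()
daily_slice.rename(columns={'Day': 'Date',
                            'Number of Bicycle Hires': 'Number_of_Bicycle_Hires'}, inplace=True)
print(list(daily_slice.columns))  # → ['Date', 'Number_of_Bicycle_Hires']
```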
Checking the data types of the features :
data_yearly_hires.dtypes
Year                 object
Bike_Hires_Yearly    object
dtype: object
data_average_time.dtypes
Month                        datetime64[ns]
Average_Hire_Time_Average           float64
dtype: object
data_monthly_hire_count.dtypes
Month                 datetime64[ns]
Bike_Hires_Monthly          float64
dtype: object
percent_hires.dtypes
Year                  object
Bike_Hires_Percent    object
dtype: object
Data Cleaning : Cleaning the data for Daily Bike Ride Data
data_daily=data_daily.merge(holidays, on='Date', how='outer')
data_daily=data_daily[:3867] # keep only rows with hire data; the outer merge appends holiday dates beyond the last hire date
# data_daily['Workday'] = data_daily['Workday'].replace(np.nan, 0)
data_daily['Non_work_day'] = data_daily['Non_work_day'].replace(np.nan, 0)
data_daily.head()
| Date | Number_of_Bicycle_Hires | Non_work_day | |
|---|---|---|---|
| 0 | 2010-07-30 | 6897.0 | 0.0 |
| 1 | 2010-07-31 | 5564.0 | 0.0 |
| 2 | 2010-08-01 | 4303.0 | 0.0 |
| 3 | 2010-08-02 | 6642.0 | 0.0 |
| 4 | 2010-08-03 | 7966.0 | 0.0 |
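The outer merge above also pulls in holiday dates that fall outside the hire data, which is why the frame has to be trimmed afterwards; a left merge keeps only the hire dates in the first place. A sketch with toy frames:

```python
import pandas as pd

rides = pd.DataFrame({'Date': pd.to_datetime(['2010-07-30', '2010-07-31']),
                      'Number_of_Bicycle_Hires': [6897, 5564]})
hol = pd.DataFrame({'Date': pd.to_datetime(['2010-07-31', '2021-05-03']),
                    'Non_work_day': [1, 1]})

# 'left' keeps exactly the ride dates; missing holiday flags become NaN and are filled with 0
merged = rides.merge(hol, on='Date', how='left')
merged['Non_work_day'] = merged['Non_work_day'].fillna(0)
print(merged['Non_work_day'].tolist())  # → [0.0, 1.0]
```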
data_daily['DAY'] = data_daily['Date'].dt.day_name()
data_daily['DAY_NUMBER']=data_daily['Date'].dt.dayofweek
data_daily.loc[data_daily.DAY == 'Sunday', 'WEEKDAY_WEEKEND'] = 'Weekend'
data_daily.loc[data_daily.DAY == 'Saturday', 'WEEKDAY_WEEKEND'] = 'Weekend'
data_daily.loc[data_daily.DAY == 'Monday', 'WEEKDAY_WEEKEND'] = 'Weekday'
data_daily.loc[data_daily.DAY == 'Tuesday', 'WEEKDAY_WEEKEND'] = 'Weekday'
data_daily.loc[data_daily.DAY == 'Wednesday', 'WEEKDAY_WEEKEND'] = 'Weekday'
data_daily.loc[data_daily.DAY == 'Thursday', 'WEEKDAY_WEEKEND'] = 'Weekday'
data_daily.loc[data_daily.DAY == 'Friday', 'WEEKDAY_WEEKEND'] = 'Weekday'
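The seven `.loc` assignments above can be collapsed into one vectorised expression, since `dt.dayofweek` already encodes Saturday and Sunday as 5 and 6 — a sketch:

```python
import pandas as pd
import numpy as np

dates = pd.Series(pd.to_datetime(['2010-07-30', '2010-07-31', '2010-08-01', '2010-08-02']))
# dayofweek: Monday=0 ... Sunday=6, so >= 5 marks the weekend
weekday_weekend = np.where(dates.dt.dayofweek >= 5, 'Weekend', 'Weekday')
print(weekday_weekend.tolist())  # → ['Weekday', 'Weekend', 'Weekend', 'Weekday']
```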
data_daily['ST_DAY']=np.nan
data_daily['ST_MONTH']=np.nan
data_daily['Month']=np.nan
data_daily['ST_YEAR']=np.nan
data_daily['Wkday_Wend']=np.nan
data_daily['ST_DAY'] = data_daily['Date'].dt.day
data_daily['ST_MONTH'] = data_daily['Date'].dt.month
data_daily['ST_YEAR'] = data_daily['Date'].dt.year
only_mondays=data_daily[data_daily['DAY']=='Monday']
only_tuesdays=data_daily[data_daily['DAY']=='Tuesday']
only_wednesdays=data_daily[data_daily['DAY']=='Wednesday']
only_thursdays=data_daily[data_daily['DAY']=='Thursday']
only_fridays=data_daily[data_daily['DAY']=='Friday']
only_saturdays=data_daily[data_daily['DAY']=='Saturday']
only_sundays=data_daily[data_daily['DAY']=='Sunday']
data_daily.loc[data_daily.ST_MONTH == 1, 'Month'] = 'January'
data_daily.loc[data_daily.ST_MONTH == 2, 'Month'] = 'February'
data_daily.loc[data_daily.ST_MONTH == 3, 'Month'] = 'March'
data_daily.loc[data_daily.ST_MONTH == 4, 'Month'] = 'April'
data_daily.loc[data_daily.ST_MONTH == 5, 'Month'] = 'May'
data_daily.loc[data_daily.ST_MONTH == 6, 'Month'] = 'June'
data_daily.loc[data_daily.ST_MONTH == 7, 'Month'] = 'July'
data_daily.loc[data_daily.ST_MONTH == 8, 'Month'] = 'August'
data_daily.loc[data_daily.ST_MONTH == 9, 'Month'] = 'September'
data_daily.loc[data_daily.ST_MONTH == 10, 'Month'] = 'October'
data_daily.loc[data_daily.ST_MONTH == 11, 'Month'] = 'November'
data_daily.loc[data_daily.ST_MONTH == 12, 'Month'] = 'December'
data_daily.loc[data_daily.WEEKDAY_WEEKEND == 'Weekday', 'Wkday_Wend'] = 1
data_daily.loc[data_daily.WEEKDAY_WEEKEND == 'Weekend', 'Wkday_Wend'] = 0
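Likewise, the twelve month-name assignments and the Wkday_Wend flag can each be produced in one line with built-in accessors — a sketch:

```python
import pandas as pd

dates = pd.Series(pd.to_datetime(['2010-07-30', '2010-12-25']))
month_names = dates.dt.month_name()                  # 'July', 'December', ... directly
wkday_wend = (dates.dt.dayofweek < 5).astype(int)    # 1 = weekday, 0 = weekend
print(month_names.tolist(), wkday_wend.tolist())  # → ['July', 'December'] [1, 0]
```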
Data Cleaning : Filtering daily data based on features needed for the analysis.
data_daily=data_daily[['Date','DAY','DAY_NUMBER','WEEKDAY_WEEKEND','ST_DAY','ST_MONTH','Month','ST_YEAR',
'Non_work_day','Wkday_Wend','Number_of_Bicycle_Hires']]
data_daily.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | Number_of_Bicycle_Hires | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | 6897.0 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | 5564.0 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | 4303.0 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | 6642.0 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | 7966.0 |
Analysis : Mean rides over the years
sns.set(font_scale = 2);
# Select the column before aggregating, so only the hire counts are averaged
yearly_analysis=pd.DataFrame(data_daily.groupby('ST_YEAR')['Number_of_Bicycle_Hires'].mean())
yearly_analysis.plot(kind='bar',color='salmon',edgecolor='k',figsize=(25,10))
plt.xlabel('Year');
plt.ylabel('Mean Rides per Day');
plt.xticks(rotation=0, ha='right');
Analysis: Mean rides over the months
cats=['January','February','March','April','May','June','July','August','September','October','November','December']
sns.set(font_scale = 2);
monthly_analysis=pd.DataFrame(data_daily.groupby('Month')['Number_of_Bicycle_Hires'].mean().reindex(cats))
monthly_analysis.plot(kind='bar',color='salmon',edgecolor='k',figsize=(25,10))
plt.xlabel('Month');
plt.ylabel('Mean Rides per Day');
plt.xticks(rotation=45, ha='right');
Analysis: Mean rides based on Weekdays and Weekends
sns.set(font_scale = 2);
wkday_wkend_analysis=pd.DataFrame(data_daily.groupby('WEEKDAY_WEEKEND')['Number_of_Bicycle_Hires'].mean())
wkday_wkend_analysis.plot(kind='bar',color='salmon',edgecolor='k',figsize=(25,10))
plt.xlabel('Day Type');
plt.ylabel('Mean Rides per Day');
plt.xticks(rotation=0, ha='right');
Analysis: Mean rides over the day of the week
cats = ['Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday']
sns.set(font_scale = 2);
daily_analysis=pd.DataFrame(data_daily.groupby('DAY')['Number_of_Bicycle_Hires'].mean().reindex(cats))
daily_analysis.plot(kind='bar',color='salmon',edgecolor='k',figsize=(25,10))
plt.xlabel('Day of the Week');
plt.ylabel('Mean Rides per Day');
plt.xticks(rotation=0, ha='right');
Analysis: Bike rides over the month compared on the basis of Weekdays and Weekends
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
months=['January','February','March','April','May','June','July','August','September','October',
'November','December']
sns.barplot(x="Month", y="Number_of_Bicycle_Hires",hue="WEEKDAY_WEEKEND", data=data_daily,color='salmon',
edgecolor='k',errwidth=0,order=months);
plt.xlabel('Month');
plt.ylabel('Count of Rides');
Analysis : Distribution of Rides
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.hist(data_daily['Number_of_Bicycle_Hires'], bins = 30, color='salmon',edgecolor='k');
plt.xlabel('Rides')
plt.ylabel('Counts');
Analysis: Bike rides over the week compared on the basis of the months of the year
days = [ 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday']
months=['January','February','March','April','May','June','July','August','September','October',
'November','December']
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
sns.barplot(x="DAY", y="Number_of_Bicycle_Hires",hue='Month', data=data_daily, capsize=.2,
errwidth=0,order=days,hue_order=months)
plt.xlabel('Day of the Week')
plt.ylabel('Number of Bike Hires');
plt.legend(title='Months of the year',loc='upper center', bbox_to_anchor=(0.5, 1.25),ncol=6,
fancybox=True, shadow=True);
Machine learning is used in the next section to predict the number of bike rides. The data sets used for the prediction are
1. TfL data alone
2. TfL data merged with weather data and holiday data
3. TfL data merged with power generation data and holiday data
4. TfL data merged with weather data, power generation data and holiday data
As this analysis deals with a regression problem, five different machine learning algorithms have been used
1. Linear Regression
2. Decision Tree Regressor
3. Random Forest Regressor
4. AdaBoost Regressor
5. Neural Network
Finally, ARIMA and seasonal ARIMA (SARIMA) models have been used to forecast the number of bike rides.
The evaluation metrics of each model are listed at the end. The metrics considered are Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE); these were chosen because they are industry-standard measures of regression performance. For each model a scatter plot of test values against predicted values is also shown: the closer the points lie to the y = x reference line, the more accurate the model.
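For reference, the four metrics reduce to a few lines of numpy. Note that MAPE below is the textbook mean of absolute percentage errors, whereas the printouts later in this notebook report RMSE as a percentage of the mean hire count. A sketch with hypothetical values:

```python
import numpy as np

def regression_report(y_true, y_pred):
    """MAE, MSE, RMSE and MAPE for a regression model's predictions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    mape = np.mean(np.abs(err / y_true)) * 100  # assumes no zero targets
    return {'MAE': mae, 'MSE': mse, 'RMSE': rmse, 'MAPE': mape}

print(regression_report([100, 200, 400], [110, 180, 400]))
```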
One Hot Encoding for Time and Weekday/Weekend Feature
all_data = pd.concat([data_daily, pd.get_dummies(data_daily.ST_YEAR, prefix = 'Time')],axis = 1)
# Checking the head of the dataframe
all_data.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | ... | Time_2012 | Time_2013 | Time_2014 | Time_2015 | Time_2016 | Time_2017 | Time_2018 | Time_2019 | Time_2020 | Time_2021 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 rows × 23 columns
all_data.columns
Index(['Date', 'DAY', 'DAY_NUMBER', 'WEEKDAY_WEEKEND', 'ST_DAY', 'ST_MONTH',
'Month', 'ST_YEAR', 'Non_work_day', 'Wkday_Wend',
'Number_of_Bicycle_Hires', 'Time_2010', 'Time_2011', 'Time_2012',
'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016', 'Time_2017',
'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021'],
dtype='object')
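One caveat for the linear model below: with all thirteen Time_* columns present, the dummies sum to one in every row, making the design matrix collinear with the intercept (the "dummy variable trap"); `pd.get_dummies(..., drop_first=True)` is the usual remedy. A sketch with toy years:

```python
import pandas as pd

years = pd.Series([2010, 2011, 2012, 2011], name='ST_YEAR')
full = pd.get_dummies(years, prefix='Time')                      # 3 columns, each row sums to 1
reduced = pd.get_dummies(years, prefix='Time', drop_first=True)  # baseline year absorbed by the intercept
print(full.shape[1], reduced.shape[1])  # → 3 2
```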
The data has been split into a training set and a testing set. The model will be trained on the training set and then the test set will be used to evaluate the model.
X = all_data[['ST_DAY', 'ST_MONTH','Time_2010', 'Time_2011',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend']]
y = all_data['Number_of_Bicycle_Hires']
Linear Regression
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
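Note that `train_test_split` shuffles rows by default, so the model is trained on dates scattered across the whole period; for a time-ordered target, a chronological hold-out (train on the past, test on the most recent span) is often the fairer evaluation — a sketch, assuming a date-sorted series:

```python
import numpy as np

# toy date-ordered target: 10 sequential observations
y_all = np.arange(10)

split = int(len(y_all) * 0.6)          # first 60% for training
y_train_chrono, y_test_chrono = y_all[:split], y_all[split:]
print(len(y_train_chrono), len(y_test_chrono))  # → 6 4
```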
from sklearn.linear_model import LinearRegression
lm = LinearRegression()
lm.fit(X_train,y_train)
predictions = lm.predict(X_test)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# y = x reference line : a perfect model would place every point on it
plt.plot([0,50000],[0,50000])
plt.xlim(0,50000)
plt.ylim(0,50000);
# Metrics for Evaluation
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
# NB: this 'MAPE' is the RMSE expressed as a percentage of the mean hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
                predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
MAE: 6448.552431100257
MSE: 66507327.454023644
RMSE: 8155.202477806645
MAPE: 31.57510671903528
Decision Tree
from sklearn.tree import DecisionTreeRegressor
dt = DecisionTreeRegressor(random_state=0)
dt_params = {'max_depth':np.arange(1,50,2),'min_samples_leaf':np.arange(2,15)}
from sklearn.model_selection import GridSearchCV
gs_dt = GridSearchCV(dt,dt_params,cv=3)
gs_dt.fit(X_train,y_train)
a = gs_dt.best_params_
# Training with the best parameters (gs_dt.best_estimator_ gives the same refitted model directly)
dtr=DecisionTreeRegressor(max_depth=a['max_depth'],min_samples_leaf=a['min_samples_leaf'],random_state=0)
model = dtr.fit(X_train,y_train)
predictions = model.predict(X_test)
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
# NB: this 'MAPE' is the RMSE expressed as a percentage of the mean hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
                predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# y = x reference line
plt.plot([0,50000],[0,50000])
plt.xlim(0,50000)
plt.ylim(0,50000);
MAE: 4069.329561329943
MSE: 30027910.148318723
RMSE: 5479.772819042659
MAPE: 21.21644582439306
Random Forest
# Finding best parameters for RandomForestRegressor
from sklearn.ensemble import RandomForestRegressor
rf = RandomForestRegressor(random_state=0)
rf_params = {'n_estimators':np.arange(25,150,25),'max_depth':np.arange(1,11,2),
'min_samples_leaf':np.arange(2,15,3)}
from sklearn.model_selection import GridSearchCV
gs_rf = GridSearchCV(rf,rf_params,cv=3)
gs_rf.fit(X_train,y_train)
b = gs_rf.best_params_
RF = RandomForestRegressor(n_estimators=b['n_estimators'],max_depth=b['max_depth'],
min_samples_leaf=b['min_samples_leaf'],random_state=0)
model = RF.fit(X_train,y_train)
pred = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, pred))
print('MSE:', metrics.mean_squared_error(y_test, pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,pred)))
# NB: this 'MAPE' is the RMSE expressed as a percentage of the mean hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
                pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# y = x reference line
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4000.821519814423
MSE: 29681972.944895517
RMSE: 5448.116458455666
MAPE: 21.09387952802081
AdaBoost Regressor
from sklearn.ensemble import AdaBoostRegressor
# NB: in scikit-learn >= 1.2 the 'base_estimator' argument is renamed 'estimator'
ar = AdaBoostRegressor(base_estimator=RF,random_state=0)
ar_params = {'n_estimators':np.arange(25,200,25)}
from sklearn.model_selection import GridSearchCV
gs_ar = GridSearchCV(ar,ar_params,cv=3)
gs_ar.fit(X_train,y_train)
c = gs_ar.best_params_
# Fitting the model with best params
ab_rf = AdaBoostRegressor(base_estimator=RF,n_estimators=c['n_estimators'],random_state=0)
model = ab_rf.fit(X_train,y_train)
y_pred = model.predict(X_test);
print('MAE:', metrics.mean_absolute_error(y_test, y_pred))
print('MSE:', metrics.mean_squared_error(y_test, y_pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,y_pred)))
# NB: this 'MAPE' is the RMSE expressed as a percentage of the mean hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
                y_pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,y_pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# y = x reference line
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4045.366969967796
MSE: 27243208.142824523
RMSE: 5219.50267198174
MAPE: 20.20873845089842
Neural Network
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.optimizers import Adam
X = all_data[['ST_DAY', 'ST_MONTH','Time_2010', 'Time_2011',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend']]
y = all_data['Number_of_Bicycle_Hires']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
model = Sequential()
model.add(Dense(50,activation='relu'))
model.add(Dense(25,activation='relu'))
model.add(Dense(20,activation='relu'))
model.add(Dense(10,activation='relu'))
model.add(Dense(1))
model.compile(optimizer='adam',loss='mse')
from tensorflow.keras.callbacks import EarlyStopping
early_stop = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=25)
model.fit(x=X_train.values,y=y_train.values,
validation_data=(X_test.values,y_test.values),batch_size=128,epochs=400,callbacks=[early_stop]);
Train on 2320 samples, validate on 1547 samples
Epoch 1/400
2320/2320 [==============================] - 0s 156us/step - loss: 764021107.2000 - val_loss: 751432519.9017
Epoch 2/400
2320/2320 [==============================] - 0s 16us/step - loss: 763877349.5172 - val_loss: 751376242.5546
...
Epoch 98/400
2320/2320 [==============================] - 0s 13us/step - loss: 75742619.9172 - val_loss: 69815730.1694
Epoch 99/400
2320/2320 [==============================] - 0s 13us/step - loss: 75515581.9034 - val_loss: 69595157.4867
Epoch 100/400 2320/2320 [==============================] - 0s 14us/step - loss: 75500828.0276 - val_loss: 69473674.8649 Epoch 101/400 2320/2320 [==============================] - 0s 13us/step - loss: 75411305.1586 - val_loss: 69372365.8384 Epoch 102/400 2320/2320 [==============================] - 0s 14us/step - loss: 75089550.5655 - val_loss: 69201496.0931 Epoch 103/400 2320/2320 [==============================] - 0s 14us/step - loss: 75033781.4621 - val_loss: 69221047.9224 Epoch 104/400 2320/2320 [==============================] - 0s 14us/step - loss: 74835015.3103 - val_loss: 69036688.3051 Epoch 105/400 2320/2320 [==============================] - 0s 13us/step - loss: 74655899.9172 - val_loss: 68902998.6451 Epoch 106/400 2320/2320 [==============================] - 0s 14us/step - loss: 74518220.1931 - val_loss: 68874420.6438 Epoch 107/400 2320/2320 [==============================] - 0s 13us/step - loss: 74384447.2828 - val_loss: 68704046.3969 Epoch 108/400 2320/2320 [==============================] - 0s 13us/step - loss: 74272036.3034 - val_loss: 68551253.5695 Epoch 109/400 2320/2320 [==============================] - 0s 14us/step - loss: 74334526.7586 - val_loss: 68488136.7136 Epoch 110/400 2320/2320 [==============================] - 0s 14us/step - loss: 74057273.4897 - val_loss: 68401710.8933 Epoch 111/400 2320/2320 [==============================] - 0s 14us/step - loss: 73916871.8897 - val_loss: 68309072.4396 Epoch 112/400 2320/2320 [==============================] - 0s 13us/step - loss: 73743683.5862 - val_loss: 68072456.0155 Epoch 113/400 2320/2320 [==============================] - 0s 15us/step - loss: 73740853.3517 - val_loss: 68265296.5740 Epoch 114/400 2320/2320 [==============================] - 0s 14us/step - loss: 73642620.6897 - val_loss: 67882027.5992 Epoch 115/400 2320/2320 [==============================] - 0s 15us/step - loss: 73462166.0138 - val_loss: 67918841.9858 Epoch 116/400 2320/2320 [==============================] - 0s 24us/step - loss: 
73380125.2414 - val_loss: 67829914.5029 Epoch 117/400 2320/2320 [==============================] - 0s 15us/step - loss: 73308932.1379 - val_loss: 67530431.8345 Epoch 118/400 2320/2320 [==============================] - 0s 15us/step - loss: 73092200.1379 - val_loss: 67564516.0595 Epoch 119/400 2320/2320 [==============================] - 0s 17us/step - loss: 73075773.1586 - val_loss: 67457253.1299 Epoch 120/400 2320/2320 [==============================] - 0s 12us/step - loss: 72819739.5310 - val_loss: 67283889.1015 Epoch 121/400 2320/2320 [==============================] - 0s 12us/step - loss: 72786267.9172 - val_loss: 67125888.9360 Epoch 122/400 2320/2320 [==============================] - 0s 11us/step - loss: 72690403.8345 - val_loss: 67192445.0627 Epoch 123/400 2320/2320 [==============================] - 0s 11us/step - loss: 72516595.6414 - val_loss: 67019338.9011 Epoch 124/400 2320/2320 [==============================] - 0s 11us/step - loss: 72588891.4759 - val_loss: 66870967.0899 Epoch 125/400 2320/2320 [==============================] - 0s 11us/step - loss: 72271546.9241 - val_loss: 66703310.0452 Epoch 126/400 2320/2320 [==============================] - 0s 11us/step - loss: 72388427.8621 - val_loss: 66742108.8455 Epoch 127/400 2320/2320 [==============================] - 0s 13us/step - loss: 71974453.6276 - val_loss: 66439491.6716 Epoch 128/400 2320/2320 [==============================] - 0s 16us/step - loss: 71981039.4483 - val_loss: 66340211.5217 Epoch 129/400 2320/2320 [==============================] - 0s 15us/step - loss: 71762046.4552 - val_loss: 66223576.1189 Epoch 130/400 2320/2320 [==============================] - 0s 14us/step - loss: 71634244.3034 - val_loss: 66224509.9315 Epoch 131/400 2320/2320 [==============================] - 0s 13us/step - loss: 71679429.9586 - val_loss: 66004045.5591 Epoch 132/400 2320/2320 [==============================] - 0s 14us/step - loss: 71291061.9034 - val_loss: 66064523.9302 Epoch 133/400 2320/2320 
[==============================] - 0s 14us/step - loss: 71314932.6345 - val_loss: 65836983.7052 Epoch 134/400 2320/2320 [==============================] - 0s 14us/step - loss: 71377916.9655 - val_loss: 65659695.9741 Epoch 135/400 2320/2320 [==============================] - 0s 14us/step - loss: 71204647.9310 - val_loss: 65701695.2812 Epoch 136/400 2320/2320 [==============================] - 0s 13us/step - loss: 70972850.2345 - val_loss: 65369215.5656 Epoch 137/400 2320/2320 [==============================] - 0s 14us/step - loss: 70783699.9724 - val_loss: 65391774.8830 Epoch 138/400 2320/2320 [==============================] - 0s 13us/step - loss: 70529321.4345 - val_loss: 65195593.6858 Epoch 139/400 2320/2320 [==============================] - 0s 13us/step - loss: 70358958.3448 - val_loss: 65041974.1745 Epoch 140/400 2320/2320 [==============================] - 0s 14us/step - loss: 70243969.0759 - val_loss: 64933995.8992 Epoch 141/400 2320/2320 [==============================] - 0s 14us/step - loss: 70061558.1793 - val_loss: 64788572.4835 Epoch 142/400 2320/2320 [==============================] - 0s 14us/step - loss: 69884551.0621 - val_loss: 64610705.1946 Epoch 143/400 2320/2320 [==============================] - 0s 14us/step - loss: 69749928.4966 - val_loss: 64439919.2243 Epoch 144/400 2320/2320 [==============================] - 0s 14us/step - loss: 69551476.1379 - val_loss: 64271427.8474 Epoch 145/400 2320/2320 [==============================] - 0s 14us/step - loss: 69739388.8000 - val_loss: 64257665.4480 Epoch 146/400 2320/2320 [==============================] - 0s 15us/step - loss: 69227328.4414 - val_loss: 63918446.6658 Epoch 147/400 2320/2320 [==============================] - 0s 14us/step - loss: 68991812.8000 - val_loss: 63842176.4913 Epoch 148/400 2320/2320 [==============================] - 0s 14us/step - loss: 68919726.5931 - val_loss: 63677604.3232 Epoch 149/400 2320/2320 [==============================] - 0s 14us/step - loss: 68666644.4690 - 
val_loss: 63529329.7324 Epoch 150/400 2320/2320 [==============================] - 0s 14us/step - loss: 68478057.4345 - val_loss: 63205612.0026 Epoch 151/400 2320/2320 [==============================] - 0s 14us/step - loss: 68342470.3448 - val_loss: 63133366.5133 Epoch 152/400 2320/2320 [==============================] - 0s 14us/step - loss: 68015667.7517 - val_loss: 62830932.1319 Epoch 153/400 2320/2320 [==============================] - 0s 14us/step - loss: 67833930.4828 - val_loss: 62668486.8830 Epoch 154/400 2320/2320 [==============================] - 0s 13us/step - loss: 67637381.4621 - val_loss: 62482473.8617 Epoch 155/400 2320/2320 [==============================] - 0s 14us/step - loss: 67507001.9034 - val_loss: 62255368.8662 Epoch 156/400 2320/2320 [==============================] - 0s 13us/step - loss: 67157763.0897 - val_loss: 62017739.4441 Epoch 157/400 2320/2320 [==============================] - 0s 14us/step - loss: 66911928.1103 - val_loss: 61866221.1144 Epoch 158/400 2320/2320 [==============================] - 0s 14us/step - loss: 66691825.8759 - val_loss: 61603959.6587 Epoch 159/400 2320/2320 [==============================] - 0s 14us/step - loss: 66431232.4414 - val_loss: 61464984.2896 Epoch 160/400 2320/2320 [==============================] - 0s 15us/step - loss: 66199637.5172 - val_loss: 61107298.1202 Epoch 161/400 2320/2320 [==============================] - 0s 15us/step - loss: 65907675.4483 - val_loss: 60935298.0659 Epoch 162/400 2320/2320 [==============================] - 0s 15us/step - loss: 65587123.1448 - val_loss: 60593870.8365 Epoch 163/400 2320/2320 [==============================] - 0s 15us/step - loss: 65343935.4483 - val_loss: 60326054.5107 Epoch 164/400 2320/2320 [==============================] - 0s 13us/step - loss: 65057357.7655 - val_loss: 60055675.0200 Epoch 165/400 2320/2320 [==============================] - 0s 14us/step - loss: 64765767.8069 - val_loss: 60040881.7427 Epoch 166/400 2320/2320 
[==============================] - 0s 13us/step - loss: 64465673.3241 - val_loss: 59522725.9987 Epoch 167/400 2320/2320 [==============================] - 0s 14us/step - loss: 64119643.9172 - val_loss: 59247513.3859 Epoch 168/400 2320/2320 [==============================] - 0s 14us/step - loss: 63806405.6828 - val_loss: 58895203.0511 Epoch 169/400 2320/2320 [==============================] - 0s 13us/step - loss: 63383757.1310 - val_loss: 58591977.0187 Epoch 170/400 2320/2320 [==============================] - 0s 14us/step - loss: 63096660.0828 - val_loss: 58254441.6858 Epoch 171/400 2320/2320 [==============================] - 0s 15us/step - loss: 62807790.6207 - val_loss: 58242197.7505 Epoch 172/400 2320/2320 [==============================] - 0s 14us/step - loss: 62310674.4276 - val_loss: 57562514.6736 Epoch 173/400 2320/2320 [==============================] - 0s 15us/step - loss: 62103621.1862 - val_loss: 57209323.0097 Epoch 174/400 2320/2320 [==============================] - 0s 14us/step - loss: 61912212.9379 - val_loss: 56956033.6445 Epoch 175/400 2320/2320 [==============================] - 0s 14us/step - loss: 61221660.4414 - val_loss: 56512613.1661 Epoch 176/400 2320/2320 [==============================] - 0s 14us/step - loss: 60828296.6621 - val_loss: 56127972.4732 Epoch 177/400 2320/2320 [==============================] - 0s 13us/step - loss: 60447829.5172 - val_loss: 55886825.1635 Epoch 178/400 2320/2320 [==============================] - 0s 13us/step - loss: 60067271.3931 - val_loss: 55462212.1939 Epoch 179/400 2320/2320 [==============================] - 0s 13us/step - loss: 59799196.6897 - val_loss: 54976186.4202 Epoch 180/400 2320/2320 [==============================] - 0s 14us/step - loss: 59250368.0000 - val_loss: 54569319.9173 Epoch 181/400 2320/2320 [==============================] - 0s 13us/step - loss: 58886781.8621 - val_loss: 54504112.0621 Epoch 182/400 2320/2320 [==============================] - 0s 13us/step - loss: 58362390.1793 - 
val_loss: 53789617.3859 Epoch 183/400 2320/2320 [==============================] - 0s 13us/step - loss: 57880783.2276 - val_loss: 53351657.0860 Epoch 184/400 2320/2320 [==============================] - 0s 14us/step - loss: 57474604.3310 - val_loss: 52940226.6270 Epoch 185/400 2320/2320 [==============================] - 0s 14us/step - loss: 56878591.3379 - val_loss: 52468290.5029 Epoch 186/400 2320/2320 [==============================] - 0s 13us/step - loss: 56438930.9793 - val_loss: 52084680.0052 Epoch 187/400 2320/2320 [==============================] - 0s 12us/step - loss: 55982554.9793 - val_loss: 51591397.4971 Epoch 188/400 2320/2320 [==============================] - 0s 13us/step - loss: 55455216.4414 - val_loss: 51099838.3866 Epoch 189/400 2320/2320 [==============================] - 0s 13us/step - loss: 55168144.0000 - val_loss: 50653171.9043 Epoch 190/400 2320/2320 [==============================] - 0s 14us/step - loss: 54568466.4828 - val_loss: 50202469.0834 Epoch 191/400 2320/2320 [==============================] - 0s 14us/step - loss: 54023146.2621 - val_loss: 49976520.0052 Epoch 192/400 2320/2320 [==============================] - 0s 13us/step - loss: 53645249.9310 - val_loss: 49323318.9451 Epoch 193/400 2320/2320 [==============================] - 0s 13us/step - loss: 53182917.7379 - val_loss: 48887315.1338 Epoch 194/400 2320/2320 [==============================] - 0s 13us/step - loss: 52614984.8828 - val_loss: 48559861.4971 Epoch 195/400 2320/2320 [==============================] - 0s 13us/step - loss: 52122514.6483 - val_loss: 48157194.3219 Epoch 196/400 2320/2320 [==============================] - 0s 13us/step - loss: 51646910.0690 - val_loss: 47538630.4796 Epoch 197/400 2320/2320 [==============================] - 0s 14us/step - loss: 51162812.9517 - val_loss: 47092815.2915 Epoch 198/400 2320/2320 [==============================] - 0s 14us/step - loss: 50642447.7793 - val_loss: 46698195.6354 Epoch 199/400 2320/2320 
[==============================] - 0s 14us/step - loss: 50218089.1310 - val_loss: 46215637.2437 Epoch 200/400 2320/2320 [==============================] - 0s 14us/step - loss: 49597288.4690 - val_loss: 45732918.1952 Epoch 201/400 2320/2320 [==============================] - 0s 14us/step - loss: 49274572.5241 - val_loss: 45459401.8720 Epoch 202/400 2320/2320 [==============================] - 0s 14us/step - loss: 48859354.8690 - val_loss: 45052948.7576 Epoch 203/400 2320/2320 [==============================] - 0s 13us/step - loss: 48244615.4207 - val_loss: 44518622.0194 Epoch 204/400 2320/2320 [==============================] - 0s 11us/step - loss: 47887917.0207 - val_loss: 44325928.2069 Epoch 205/400 2320/2320 [==============================] - 0s 11us/step - loss: 47424058.5931 - val_loss: 43779523.6975 Epoch 206/400 2320/2320 [==============================] - 0s 11us/step - loss: 47073213.2966 - val_loss: 43849610.9218 Epoch 207/400 2320/2320 [==============================] - 0s 11us/step - loss: 46975050.2621 - val_loss: 42950511.7518 Epoch 208/400 2320/2320 [==============================] - 0s 11us/step - loss: 46303219.6414 - val_loss: 42616570.1099 Epoch 209/400 2320/2320 [==============================] - 0s 14us/step - loss: 45841471.0621 - val_loss: 42246469.1041 Epoch 210/400 2320/2320 [==============================] - 0s 14us/step - loss: 45582371.9172 - val_loss: 42173666.7098 Epoch 211/400 2320/2320 [==============================] - 0s 14us/step - loss: 45438927.0207 - val_loss: 41710410.9270 Epoch 212/400 2320/2320 [==============================] - 0s 13us/step - loss: 44779327.2828 - val_loss: 41360147.6871 Epoch 213/400 2320/2320 [==============================] - 0s 13us/step - loss: 44422166.9793 - val_loss: 41044903.8552 Epoch 214/400 2320/2320 [==============================] - 0s 14us/step - loss: 44213724.5793 - val_loss: 40771435.1752 Epoch 215/400 2320/2320 [==============================] - 0s 14us/step - loss: 44138416.8276 - 
val_loss: 40532924.6335 Epoch 216/400 2320/2320 [==============================] - 0s 14us/step - loss: 43652346.6759 - val_loss: 40197358.1487 Epoch 217/400 2320/2320 [==============================] - 0s 13us/step - loss: 43378427.9172 - val_loss: 40026223.8087 Epoch 218/400 2320/2320 [==============================] - 0s 14us/step - loss: 43053747.1724 - val_loss: 39693213.8125 Epoch 219/400 2320/2320 [==============================] - 0s 14us/step - loss: 42805818.7034 - val_loss: 39533024.6568 Epoch 220/400 2320/2320 [==============================] - 0s 14us/step - loss: 42512055.8897 - val_loss: 39339772.1991 Epoch 221/400 2320/2320 [==============================] - 0s 14us/step - loss: 42902809.9586 - val_loss: 39114980.1939 Epoch 222/400 2320/2320 [==============================] - 0s 14us/step - loss: 42159402.2207 - val_loss: 39210501.7453 Epoch 223/400 2320/2320 [==============================] - 0s 14us/step - loss: 41973395.1034 - val_loss: 38728934.9037 Epoch 224/400 2320/2320 [==============================] - 0s 14us/step - loss: 41696686.7034 - val_loss: 38521521.9496 Epoch 225/400 2320/2320 [==============================] - 0s 14us/step - loss: 41644919.3655 - val_loss: 38371142.1073 Epoch 226/400 2320/2320 [==============================] - 0s 13us/step - loss: 41388023.9724 - val_loss: 38206681.1687 Epoch 227/400 2320/2320 [==============================] - 0s 14us/step - loss: 41316525.6966 - val_loss: 38075111.0847 Epoch 228/400 2320/2320 [==============================] - 0s 13us/step - loss: 41201168.8828 - val_loss: 37934882.8856 Epoch 229/400 2320/2320 [==============================] - 0s 13us/step - loss: 40905454.1931 - val_loss: 37752110.0763 Epoch 230/400 2320/2320 [==============================] - 0s 13us/step - loss: 40927137.3241 - val_loss: 37806736.3878 Epoch 231/400 2320/2320 [==============================] - 0s 14us/step - loss: 40599774.6207 - val_loss: 37420327.5398 Epoch 232/400 2320/2320 
[==============================] - 0s 13us/step - loss: 40555099.8345 - val_loss: 37386012.1370 Epoch 233/400 2320/2320 [==============================] - 0s 13us/step - loss: 40274589.1586 - val_loss: 37162830.8003 Epoch 234/400 2320/2320 [==============================] - 0s 14us/step - loss: 40185620.6621 - val_loss: 37257060.5094 Epoch 235/400 2320/2320 [==============================] - 0s 14us/step - loss: 40136667.1586 - val_loss: 36904105.5152 Epoch 236/400 2320/2320 [==============================] - 0s 13us/step - loss: 39915641.5724 - val_loss: 36985906.4357 Epoch 237/400 2320/2320 [==============================] - 0s 14us/step - loss: 39842094.4828 - val_loss: 36887627.9509 Epoch 238/400 2320/2320 [==============================] - 0s 14us/step - loss: 39751885.8483 - val_loss: 36591226.0995 Epoch 239/400 2320/2320 [==============================] - 0s 13us/step - loss: 39527999.2828 - val_loss: 36456264.3672 Epoch 240/400 2320/2320 [==============================] - 0s 13us/step - loss: 39365415.0069 - val_loss: 36341989.5488 Epoch 241/400 2320/2320 [==============================] - 0s 13us/step - loss: 39531440.4414 - val_loss: 36407812.2870 Epoch 242/400 2320/2320 [==============================] - 0s 13us/step - loss: 39216300.7724 - val_loss: 36256922.1357 Epoch 243/400 2320/2320 [==============================] - 0s 13us/step - loss: 39140121.0483 - val_loss: 36264025.6703 Epoch 244/400 2320/2320 [==============================] - 0s 13us/step - loss: 39123113.9310 - val_loss: 36396612.8610 Epoch 245/400 2320/2320 [==============================] - 0s 14us/step - loss: 39319001.7103 - val_loss: 36016023.2708 Epoch 246/400 2320/2320 [==============================] - 0s 14us/step - loss: 38834740.0000 - val_loss: 35818043.2321 Epoch 247/400 2320/2320 [==============================] - 0s 18us/step - loss: 38766506.6483 - val_loss: 35763381.9884 Epoch 248/400 2320/2320 [==============================] - 0s 14us/step - loss: 38742961.2138 - 
val_loss: 35815431.3381 Epoch 249/400 2320/2320 [==============================] - 0s 15us/step - loss: 38652035.6000 - val_loss: 35719925.6419 Epoch 250/400 2320/2320 [==============================] - 0s 14us/step - loss: 38665787.8897 - val_loss: 35751982.0814 Epoch 251/400 2320/2320 [==============================] - 0s 16us/step - loss: 38639403.5310 - val_loss: 35495833.4842 Epoch 252/400 2320/2320 [==============================] - 0s 17us/step - loss: 38360430.0966 - val_loss: 35494708.1836 Epoch 253/400 2320/2320 [==============================] - 0s 17us/step - loss: 38329326.2897 - val_loss: 35376843.8474 Epoch 254/400 2320/2320 [==============================] - 0s 13us/step - loss: 38404280.2483 - val_loss: 35286402.9528 Epoch 255/400 2320/2320 [==============================] - 0s 13us/step - loss: 38220307.7241 - val_loss: 35218477.4971 Epoch 256/400 2320/2320 [==============================] - 0s 13us/step - loss: 38196547.3103 - val_loss: 35243154.8597 Epoch 257/400 2320/2320 [==============================] - 0s 14us/step - loss: 38229593.4345 - val_loss: 35151491.7233 Epoch 258/400 2320/2320 [==============================] - 0s 14us/step - loss: 38135726.4000 - val_loss: 35241363.0045 Epoch 259/400 2320/2320 [==============================] - 0s 14us/step - loss: 38146917.0759 - val_loss: 35088932.5766 Epoch 260/400 2320/2320 [==============================] - 0s 14us/step - loss: 38101042.0966 - val_loss: 35088225.7324 Epoch 261/400 2320/2320 [==============================] - 0s 14us/step - loss: 37966409.9034 - val_loss: 35017715.3769 Epoch 262/400 2320/2320 [==============================] - 0s 13us/step - loss: 37963524.7034 - val_loss: 35072214.7692 Epoch 263/400 2320/2320 [==============================] - 0s 13us/step - loss: 37965194.4966 - val_loss: 34951020.6490 Epoch 264/400 2320/2320 [==============================] - 0s 13us/step - loss: 37949192.6897 - val_loss: 34892466.6322 Epoch 265/400 2320/2320 
[==============================] - 0s 13us/step - loss: 37897004.9655 - val_loss: 34886039.2243 Epoch 266/400 2320/2320 [==============================] - 0s 14us/step - loss: 37804744.0276 - val_loss: 34807264.6981 Epoch 267/400 2320/2320 [==============================] - 0s 13us/step - loss: 37789166.9931 - val_loss: 34776045.1092 Epoch 268/400 2320/2320 [==============================] - 0s 13us/step - loss: 37753613.6828 - val_loss: 34941263.1622 Epoch 269/400 2320/2320 [==============================] - 0s 13us/step - loss: 37841328.7448 - val_loss: 34687052.2043 Epoch 270/400 2320/2320 [==============================] - 0s 13us/step - loss: 37711833.2966 - val_loss: 34719987.0873 Epoch 271/400 2320/2320 [==============================] - 0s 13us/step - loss: 37907544.8000 - val_loss: 34659465.5048 Epoch 272/400 2320/2320 [==============================] - 0s 13us/step - loss: 37814503.8069 - val_loss: 34762689.7789 Epoch 273/400 2320/2320 [==============================] - 0s 15us/step - loss: 37840061.0759 - val_loss: 34628798.8106 Epoch 274/400 2320/2320 [==============================] - 0s 13us/step - loss: 37530262.9241 - val_loss: 34616459.3407 Epoch 275/400 2320/2320 [==============================] - 0s 13us/step - loss: 37581231.5310 - val_loss: 34694123.5889 Epoch 276/400 2320/2320 [==============================] - 0s 14us/step - loss: 37548786.5103 - val_loss: 34582598.1694 Epoch 277/400 2320/2320 [==============================] - 0s 13us/step - loss: 37481253.5172 - val_loss: 34690660.5352 Epoch 278/400 2320/2320 [==============================] - 0s 13us/step - loss: 37665598.2621 - val_loss: 34532064.3672 Epoch 279/400 2320/2320 [==============================] - 0s 14us/step - loss: 37665230.0966 - val_loss: 34571819.4389 Epoch 280/400 2320/2320 [==============================] - 0s 13us/step - loss: 37503556.8828 - val_loss: 34524653.8591 Epoch 281/400 2320/2320 [==============================] - 0s 14us/step - loss: 37394554.2069 - 
val_loss: 34482183.8759 Epoch 282/400 2320/2320 [==============================] - 0s 14us/step - loss: 37419913.9862 - val_loss: 34557564.7162 Epoch 283/400 2320/2320 [==============================] - 0s 13us/step - loss: 37402107.8621 - val_loss: 34566170.1668 Epoch 284/400 2320/2320 [==============================] - 0s 14us/step - loss: 37490609.5586 - val_loss: 34498281.2670 Epoch 285/400 2320/2320 [==============================] - 0s 14us/step - loss: 37399269.7103 - val_loss: 34543533.2437 Epoch 286/400 2320/2320 [==============================] - 0s 14us/step - loss: 37317310.3138 - val_loss: 34582600.7860 Epoch 287/400 2320/2320 [==============================] - 0s 13us/step - loss: 37430582.7310 - val_loss: 34446013.4764 Epoch 288/400 2320/2320 [==============================] - 0s 14us/step - loss: 37401147.8483 - val_loss: 34416112.6412 Epoch 289/400 2320/2320 [==============================] - 0s 13us/step - loss: 37422196.7724 - val_loss: 34427034.9735 Epoch 290/400 2320/2320 [==============================] - 0s 14us/step - loss: 37264090.2621 - val_loss: 34425301.0368 Epoch 291/400 2320/2320 [==============================] - 0s 14us/step - loss: 37341186.7862 - val_loss: 34408902.8313 Epoch 292/400 2320/2320 [==============================] - 0s 14us/step - loss: 37351528.3310 - val_loss: 34363874.8804 Epoch 293/400 2320/2320 [==============================] - 0s 14us/step - loss: 37279534.2345 - val_loss: 34370474.5908 Epoch 294/400 2320/2320 [==============================] - 0s 14us/step - loss: 37476197.4897 - val_loss: 34352504.3310 Epoch 295/400 2320/2320 [==============================] - 0s 14us/step - loss: 37309985.3517 - val_loss: 34331654.5986 Epoch 296/400 2320/2320 [==============================] - 0s 14us/step - loss: 37357360.2069 - val_loss: 34334305.6134 Epoch 297/400 2320/2320 [==============================] - 0s 13us/step - loss: 37264471.1448 - val_loss: 34318996.9593 Epoch 298/400 2320/2320 
[==============================] - 0s 13us/step - loss: 37214641.6276 - val_loss: 34315903.3174 Epoch 299/400 2320/2320 [==============================] - 0s 13us/step - loss: 37281564.4690 - val_loss: 34306205.8436 Epoch 300/400 2320/2320 [==============================] - 0s 13us/step - loss: 37267871.6690 - val_loss: 34338516.2560 Epoch 301/400 2320/2320 [==============================] - 0s 13us/step - loss: 37221989.8483 - val_loss: 34332920.7550 Epoch 302/400 2320/2320 [==============================] - 0s 13us/step - loss: 37309714.8690 - val_loss: 34367962.0840 Epoch 303/400 2320/2320 [==============================] - 0s 13us/step - loss: 37319621.1172 - val_loss: 34291375.8242 Epoch 304/400 2320/2320 [==============================] - 0s 13us/step - loss: 37261451.9724 - val_loss: 34333646.4952 Epoch 305/400 2320/2320 [==============================] - 0s 14us/step - loss: 37337321.5172 - val_loss: 34519360.1189 Epoch 306/400 2320/2320 [==============================] - 0s 13us/step - loss: 37797711.4207 - val_loss: 34293949.2851 Epoch 307/400 2320/2320 [==============================] - 0s 13us/step - loss: 37239627.5034 - val_loss: 34456994.8959 Epoch 308/400 2320/2320 [==============================] - 0s 13us/step - loss: 37241086.0276 - val_loss: 34207881.0032 Epoch 309/400 2320/2320 [==============================] - 0s 14us/step - loss: 37178599.3241 - val_loss: 34231021.7143 Epoch 310/400 2320/2320 [==============================] - 0s 13us/step - loss: 37148533.0897 - val_loss: 34250727.8397 Epoch 311/400 2320/2320 [==============================] - 0s 14us/step - loss: 37287162.1241 - val_loss: 34290145.4066 Epoch 312/400 2320/2320 [==============================] - 0s 13us/step - loss: 37357671.5310 - val_loss: 34338942.9502 Epoch 313/400 2320/2320 [==============================] - 0s 13us/step - loss: 37236610.3724 - val_loss: 34426764.4111 Epoch 314/400 2320/2320 [==============================] - 0s 13us/step - loss: 37300566.8414 - 
val_loss: 34343482.9787 Epoch 315/400 2320/2320 [==============================] - 0s 13us/step - loss: 37266379.9172 - val_loss: 34255113.7169 Epoch 316/400 2320/2320 [==============================] - 0s 13us/step - loss: 37199043.4759 - val_loss: 34307502.1125 Epoch 317/400 2320/2320 [==============================] - 0s 14us/step - loss: 37300144.1655 - val_loss: 34228037.8849 Epoch 318/400 2320/2320 [==============================] - 0s 14us/step - loss: 37167257.7103 - val_loss: 34329986.1564 Epoch 319/400 2320/2320 [==============================] - 0s 14us/step - loss: 37254335.2552 - val_loss: 34351601.7169 Epoch 320/400 2320/2320 [==============================] - 0s 13us/step - loss: 37208976.5241 - val_loss: 34458932.4628 Epoch 321/400 2320/2320 [==============================] - 0s 13us/step - loss: 37206688.9379 - val_loss: 34219701.3885 Epoch 322/400 2320/2320 [==============================] - 0s 14us/step - loss: 37153110.5931 - val_loss: 34346182.2366 Epoch 323/400 2320/2320 [==============================] - 0s 14us/step - loss: 37213276.7310 - val_loss: 34219063.2967 Epoch 324/400 2320/2320 [==============================] - 0s 13us/step - loss: 37199581.0207 - val_loss: 34195362.5546 Epoch 325/400 2320/2320 [==============================] - 0s 14us/step - loss: 37153695.7655 - val_loss: 34258913.1222 Epoch 326/400 2320/2320 [==============================] - 0s 13us/step - loss: 37447086.0138 - val_loss: 34255861.3006 Epoch 327/400 2320/2320 [==============================] - 0s 13us/step - loss: 37411525.9034 - val_loss: 34180837.6626 Epoch 328/400 2320/2320 [==============================] - 0s 13us/step - loss: 37199081.3931 - val_loss: 34379760.9153 Epoch 329/400 2320/2320 [==============================] - 0s 13us/step - loss: 37198804.9103 - val_loss: 34186951.3019 Epoch 330/400 2320/2320 [==============================] - 0s 13us/step - loss: 37192206.3724 - val_loss: 34261404.9489 Epoch 331/400 2320/2320 
... (training log abridged; the loss plateaus over the final epochs) ...
Epoch 400/400 2320/2320 [==============================] - 0s 13us/step - loss: 37038571.8345 - val_loss: 34139546.9787
losses = pd.DataFrame(model.history.history)
losses.plot(figsize=(20,8))
predictions = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Predicted Values')
plt.ylabel('Test Values');
# Reference diagonal: a perfect model would place every point on this line
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4343.0254322883 MSE: 34139546.34012055
RMSE: 5842.905641897749
MAPE: 22.62241430476337
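Note that the quantity printed as 'MAPE' above is actually RMSE divided by the mean number of hires, i.e. a normalized RMSE expressed as a percentage. If a true mean absolute percentage error is wanted, a minimal sketch (the helper name `mape` is ours):

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

# toy check: 10% error on each point
print(round(mape([100, 200], [110, 180]), 1))  # -> 10.0
```

Recent scikit-learn versions also ship `metrics.mean_absolute_percentage_error` (as a fraction, not a percentage).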
Merging 'weatherdata' into 'data_daily' as 'final_data'
final_data=data_daily.merge(weatherdata, on='Date', how='inner')
final_data.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | ... | cloudcover | humidity | precipMM | pressure | tempC | visibility | winddirDegree | windspeedKmph | location | Dates | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | ... | 39 | 70 | 0.0 | 1015 | 24 | 9 | 269 | 9 | London | 2010-07-30 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | ... | 74 | 81 | 2.0 | 1012 | 22 | 8 | 237 | 12 | London | 2010-07-31 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | ... | 57 | 77 | 2.9 | 1013 | 22 | 9 | 257 | 8 | London | 2010-08-01 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | ... | 33 | 75 | 7.3 | 1017 | 22 | 9 | 257 | 8 | London | 2010-08-02 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | ... | 37 | 77 | 0.1 | 1016 | 23 | 7 | 252 | 9 | London | 2010-08-03 |
5 rows × 36 columns
final_data.columns
Index(['Date', 'DAY', 'DAY_NUMBER', 'WEEKDAY_WEEKEND', 'ST_DAY', 'ST_MONTH',
'Month', 'ST_YEAR', 'Non_work_day', 'Wkday_Wend',
'Number_of_Bicycle_Hires', 'maxtempC', 'mintempC', 'totalSnow_cm',
'sunHour', 'uvIndex', 'moon_illumination', 'moonrise', 'moonset',
'sunrise', 'sunset', 'DewPointC', 'FeelsLikeC', 'HeatIndexC',
'WindChillC', 'WindGustKmph', 'cloudcover', 'humidity', 'precipMM',
'pressure', 'tempC', 'visibility', 'winddirDegree', 'windspeedKmph',
'location', 'Dates'],
dtype='object')
Merging with the weather data adds a number of features. While some of them look very important, others (for example, moon_illumination) do not appear to carry much information. To leave nothing to chance, the feature correlations were plotted and only features with high correlation were considered for the analysis.
data_corr = final_data.corr()
mask = np.array(data_corr)
sns.set_context('poster',font_scale=0.3)
mask[np.tril_indices_from(mask)] = False
fig = plt.subplots(figsize=(20,20))
sns.heatmap(data_corr, mask=mask, vmax=1, square=True, annot=True, cmap='coolwarm');
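Besides eyeballing the heatmap, the features can be ranked programmatically by their absolute correlation with the target. A minimal sketch on a toy stand-in for `final_data` (the 0.3 threshold and the toy frame are our assumptions):

```python
import pandas as pd
import numpy as np

# Toy stand-in for final_data (the real frame has 36 columns)
rng = np.random.default_rng(0)
temp = rng.normal(20, 5, 200)
df = pd.DataFrame({
    'tempC': temp,
    'Number_of_Bicycle_Hires': 1000 * temp + rng.normal(0, 500, 200),
    'moon_illumination': rng.uniform(0, 100, 200),
})

# Rank features by absolute correlation with the target
corr = df.corr()['Number_of_Bicycle_Hires'].abs().sort_values(ascending=False)
strong = corr[corr > 0.3].index.drop('Number_of_Bicycle_Hires')
print(list(strong))
```

Applied to `final_data`, the same two lines produce the shortlist that the heatmap suggests visually.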
Observation:
From the correlation heatmap, it can be seen that the Number of Bicycle Hires is positively correlated to
One-Hot Encoding:
all_data = pd.concat([final_data, pd.get_dummies(final_data.ST_YEAR, prefix = 'Time')],axis = 1)
# all_data = pd.concat([all_data, pd.get_dummies(final_data['WEEKDAY_WEEKEND'], prefix = 'W')],axis = 1)
all_data.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | ... | Time_2012 | Time_2013 | Time_2014 | Time_2015 | Time_2016 | Time_2017 | Time_2018 | Time_2019 | Time_2020 | Time_2021 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 rows × 48 columns
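One caveat worth knowing for the linear model below: with a full set of year dummies, one column is redundant (the dummy-variable trap), and `pd.get_dummies(..., drop_first=True)` removes it. A small illustration:

```python
import pandas as pd

df = pd.DataFrame({'ST_YEAR': [2010, 2010, 2011, 2012]})
full = pd.get_dummies(df['ST_YEAR'], prefix='Time')
reduced = pd.get_dummies(df['ST_YEAR'], prefix='Time', drop_first=True)
print(full.columns.tolist())     # ['Time_2010', 'Time_2011', 'Time_2012']
print(reduced.columns.tolist())  # ['Time_2011', 'Time_2012']
```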
Linear Regression
X = all_data[['ST_DAY', 'ST_MONTH','maxtempC','mintempC','sunHour', 'uvIndex','FeelsLikeC',
'HeatIndexC', 'WindChillC','tempC','pressure','visibility','Time_2010', 'Time_2011',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend','Non_work_day']]
y = all_data['Number_of_Bicycle_Hires']
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
from sklearn.linear_model import LinearRegression
lm = LinearRegression()
lm.fit(X_train,y_train)
predictions = lm.predict(X_test)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Predicted Values')
plt.ylabel('Test Values');
# Reference diagonal: a perfect model would place every point on this line
plt.plot( [0,50000],[0,50000] )
plt.xlim(0,50000)
plt.ylim(0,50000);
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
MAE: 3553.3399729147895 MSE: 24499071.457542036
RMSE: 4949.653670464433
MAPE: 19.163943910956394
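With a fitted linear model, it is often informative to inspect the coefficients, e.g. `pd.Series(lm.coef_, index=X.columns)`. A self-contained sketch on toy data (the toy frame and column names are our assumptions):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Toy stand-in: the real X/y come from all_data above
rng = np.random.default_rng(1)
X_toy = pd.DataFrame({'tempC': rng.normal(20, 5, 100),
                      'pressure': rng.normal(1013, 5, 100)})
y_toy = 800 * X_toy['tempC'] - 50 * X_toy['pressure'] + rng.normal(0, 100, 100)

lm_toy = LinearRegression().fit(X_toy, y_toy)
coef_df = pd.Series(lm_toy.coef_, index=X_toy.columns)
print(coef_df)  # tempC coefficient ≈ 800, pressure ≈ -50
```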
Decision Tree
from sklearn.tree import DecisionTreeRegressor
dt = DecisionTreeRegressor(random_state=0)
dt_params = {'max_depth':np.arange(1,50,2),'min_samples_leaf':np.arange(2,15)}
from sklearn.model_selection import GridSearchCV
gs_dt = GridSearchCV(dt,dt_params,cv=3)
gs_dt.fit(X_train,y_train)
a = gs_dt.best_params_
# Training with best parameters
# from sklearn.tree import DecisionTreeRegressor
dtr=DecisionTreeRegressor(max_depth=a['max_depth'],min_samples_leaf= a['min_samples_leaf'])
model = dtr.fit(X_train,y_train)
predictions = model.predict(X_test)
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Predicted Values')
plt.ylabel('Test Values');
# Reference diagonal: a perfect model would place every point on this line
plt.plot( [0,50000],[0,50000] )
plt.xlim(0,50000)
plt.ylim(0,50000);
MAE: 4136.567491957212 MSE: 29763984.521982096
RMSE: 5455.637865729552
MAPE: 21.123000722497167
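After a grid search it is worth printing not only `best_params_` but also `best_score_` (the mean cross-validated R² of the winning setting), to see how close the runner-up settings were. A minimal self-contained sketch (the toy regression data is our assumption):

```python
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_regression

# Toy stand-in for gs_dt above
Xd, yd = make_regression(n_samples=200, n_features=4, noise=10, random_state=0)
gs = GridSearchCV(DecisionTreeRegressor(random_state=0),
                  {'max_depth': [2, 4, 8]}, cv=3)
gs.fit(Xd, yd)
print(gs.best_params_)   # the winning max_depth (depends on the data)
print(gs.best_score_)    # mean CV R^2 of the best setting
```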
Random Forest
from sklearn.ensemble import RandomForestRegressor
rf = RandomForestRegressor(random_state=0)
rf_params = {'n_estimators':np.arange(25,150,25),'max_depth':np.arange(1,11,2),
'min_samples_leaf':np.arange(2,15,3)}
from sklearn.model_selection import GridSearchCV
gs_rf = GridSearchCV(rf,rf_params,cv=3)
gs_rf.fit(X_train,y_train)
b = gs_rf.best_params_
RF = RandomForestRegressor(n_estimators=b['n_estimators'],max_depth=b['max_depth'],
min_samples_leaf=b['min_samples_leaf'],random_state=0)
model = RF.fit(X_train,y_train)
pred = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, pred))
print('MSE:', metrics.mean_squared_error(y_test, pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,pred)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Predicted Values')
plt.ylabel('Test Values');
# Reference diagonal: a perfect model would place every point on this line
plt.plot( [0,60000],[0,60000] )
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 3254.851110552001 MSE: 18612487.542834166
RMSE: 4314.219227488813
MAPE: 16.70368490395965
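A fitted random forest also exposes `feature_importances_`, which can corroborate the correlation-based feature selection; with the model above this would be `pd.Series(RF.feature_importances_, index=X.columns)`. A self-contained toy sketch (the toy columns are our assumptions):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
Xt = pd.DataFrame({'signal': rng.normal(size=200), 'noise': rng.normal(size=200)})
yt = 10 * Xt['signal'] + rng.normal(scale=0.1, size=200)
rf_toy = RandomForestRegressor(n_estimators=50, random_state=0).fit(Xt, yt)

# Importances are normalized to sum to 1; the informative column dominates
imp = pd.Series(rf_toy.feature_importances_, index=Xt.columns).sort_values(ascending=False)
print(imp)
```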
AdaBoostRegressor
from sklearn.ensemble import AdaBoostRegressor
ar = AdaBoostRegressor(base_estimator=RF,random_state=0)
ar_params = {'n_estimators':np.arange(25,200,25)}
from sklearn.model_selection import GridSearchCV
gs_ar = GridSearchCV(ar,ar_params,cv=3)
gs_ar.fit(X_train,y_train)
c = gs_ar.best_params_
# Fitting the model with best params
ab_rf = AdaBoostRegressor(base_estimator=RF,n_estimators=c['n_estimators'],random_state=0)
model = ab_rf.fit(X_train,y_train)
y_pred = model.predict(X_test);
print('MAE:', metrics.mean_absolute_error(y_test, y_pred))
print('MSE:', metrics.mean_squared_error(y_test, y_pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,y_pred)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
y_pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,y_pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Predicted Values')
plt.ylabel('Test Values');
# Reference diagonal: a perfect model would place every point on this line
plt.plot( [0,60000],[0,60000] )
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 3303.6345511464206 MSE: 18012943.55109511
RMSE: 4244.165825117477
MAPE: 16.43245391221837
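To compare the models at a glance, the error figures printed above can be collected into a small frame. The numbers are copied from the outputs; attributing the first row to the earlier neural-network run is our reading of the notebook order:

```python
import pandas as pd

results = pd.DataFrame(
    {'RMSE': [5842.91, 4949.65, 5455.64, 4314.22, 4244.17],
     'MAPE (%)': [22.62, 19.16, 21.12, 16.70, 16.43]},
    index=['Neural Network', 'Linear Regression', 'Decision Tree',
           'Random Forest', 'AdaBoost + Random Forest'])
print(results.sort_values('RMSE'))
```

On both metrics the boosted random forest comes out best, with the plain random forest close behind.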
Neural Network
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(50,activation='relu'))
model.add(Dense(25,activation='relu'))
model.add(Dense(20,activation='relu'))
model.add(Dense(10,activation='relu'))
model.add(Dense(1))
model.compile(optimizer='adam',loss='mse')
from tensorflow.keras.callbacks import EarlyStopping
early_stop = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=25)
model.fit(x=X_train.values,y=y_train.values,
validation_data=(X_test.values,y_test.values),batch_size=128,epochs=400,callbacks=[early_stop]);
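One refinement not applied in this run: neural networks usually converge faster and to lower loss on scaled inputs, which the very large loss values below hint at. A minimal sketch with `MinMaxScaler`, fitting on the training split only to avoid leakage (the toy arrays are our assumptions):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler()
X_tr = np.array([[1.0, 1000.0], [2.0, 1010.0], [3.0, 1020.0]])
X_te = np.array([[2.5, 1015.0]])
X_tr_s = scaler.fit_transform(X_tr)   # fit on training data only
X_te_s = scaler.transform(X_te)       # reuse the same scaling for test
print(X_te_s)  # [[0.75 0.75]]
```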
Train on 2320 samples, validate on 1547 samples
Epoch 1/400 2320/2320 [==============================] - 1s 256us/step - loss: 763152249.8207 - val_loss: 749360463.4312
... (training log abridged; the loss falls steeply over the first ten epochs and then declines slowly) ...
val_loss: 37376426.9800 Epoch 184/400 2320/2320 [==============================] - 0s 17us/step - loss: 40974243.2552 - val_loss: 37397792.4913 Epoch 185/400 2320/2320 [==============================] - 0s 16us/step - loss: 40840759.9448 - val_loss: 37248043.4544 Epoch 186/400 2320/2320 [==============================] - 0s 17us/step - loss: 40648351.2828 - val_loss: 37154031.8474 Epoch 187/400 2320/2320 [==============================] - 0s 17us/step - loss: 40533251.7517 - val_loss: 37463634.3956 Epoch 188/400 2320/2320 [==============================] - 0s 17us/step - loss: 40448385.7931 - val_loss: 37028746.1099 Epoch 189/400 2320/2320 [==============================] - 0s 18us/step - loss: 40582226.9793 - val_loss: 36915953.0924 Epoch 190/400 2320/2320 [==============================] - 0s 17us/step - loss: 40621455.3103 - val_loss: 37264939.3187 Epoch 191/400 2320/2320 [==============================] - 0s 17us/step - loss: 40243401.7517 - val_loss: 36828719.1467 Epoch 192/400 2320/2320 [==============================] - 0s 17us/step - loss: 40164556.2483 - val_loss: 36818587.9082 Epoch 193/400 2320/2320 [==============================] - 0s 17us/step - loss: 40120443.8069 - val_loss: 36627627.9625 Epoch 194/400 2320/2320 [==============================] - 0s 17us/step - loss: 40159244.3586 - val_loss: 37541302.8843 Epoch 195/400 2320/2320 [==============================] - 0s 16us/step - loss: 40218625.3241 - val_loss: 36475328.9037 Epoch 196/400 2320/2320 [==============================] - 0s 18us/step - loss: 40082987.3103 - val_loss: 36624570.8455 Epoch 197/400 2320/2320 [==============================] - 0s 17us/step - loss: 39994493.9862 - val_loss: 36865244.9619 Epoch 198/400 2320/2320 [==============================] - 0s 17us/step - loss: 39819535.0897 - val_loss: 37102366.1926 Epoch 199/400 2320/2320 [==============================] - 0s 17us/step - loss: 39690628.3034 - val_loss: 36597607.9030 Epoch 200/400 2320/2320 
[==============================] - 0s 17us/step - loss: 39505315.1724 - val_loss: 36216969.5928 Epoch 201/400 2320/2320 [==============================] - 0s 17us/step - loss: 39362385.1586 - val_loss: 36471428.9994 Epoch 202/400 2320/2320 [==============================] - 0s 17us/step - loss: 39325585.9034 - val_loss: 36742732.6619 Epoch 203/400 2320/2320 [==============================] - 0s 16us/step - loss: 39350391.0207 - val_loss: 35878456.8080 Epoch 204/400 2320/2320 [==============================] - 0s 16us/step - loss: 39052755.3655 - val_loss: 37231711.4402 Epoch 205/400 2320/2320 [==============================] - 0s 14us/step - loss: 39593089.6000 - val_loss: 35711063.9548 Epoch 206/400 2320/2320 [==============================] - 0s 14us/step - loss: 39070159.8897 - val_loss: 35689897.0136 Epoch 207/400 2320/2320 [==============================] - 0s 15us/step - loss: 38929706.2897 - val_loss: 35577748.0776 Epoch 208/400 2320/2320 [==============================] - 0s 15us/step - loss: 38859684.7172 - val_loss: 35637528.3930 Epoch 209/400 2320/2320 [==============================] - 0s 14us/step - loss: 38911715.6966 - val_loss: 35426822.2909 Epoch 210/400 2320/2320 [==============================] - 0s 14us/step - loss: 38536014.8552 - val_loss: 35320659.8423 Epoch 211/400 2320/2320 [==============================] - 0s 15us/step - loss: 38720418.8966 - val_loss: 35563587.5307 Epoch 212/400 2320/2320 [==============================] - 0s 15us/step - loss: 38434425.9310 - val_loss: 35668020.2043 Epoch 213/400 2320/2320 [==============================] - 0s 14us/step - loss: 38460928.4690 - val_loss: 35119902.9916 Epoch 214/400 2320/2320 [==============================] - 0s 16us/step - loss: 38573610.5379 - val_loss: 35917177.8681 Epoch 215/400 2320/2320 [==============================] - 0s 14us/step - loss: 38528104.8828 - val_loss: 34920830.1474 Epoch 216/400 2320/2320 [==============================] - 0s 14us/step - loss: 38077626.6759 - 
val_loss: 34985560.5598 Epoch 217/400 2320/2320 [==============================] - 0s 14us/step - loss: 38136779.6552 - val_loss: 34773113.4984 Epoch 218/400 2320/2320 [==============================] - 0s 14us/step - loss: 37865350.0414 - val_loss: 34780294.6516 Epoch 219/400 2320/2320 [==============================] - 0s 14us/step - loss: 37774454.2345 - val_loss: 34597551.1674 Epoch 220/400 2320/2320 [==============================] - 0s 14us/step - loss: 37876628.9655 - val_loss: 34604257.2463 Epoch 221/400 2320/2320 [==============================] - 0s 14us/step - loss: 38226251.6966 - val_loss: 34607954.4964 Epoch 222/400 2320/2320 [==============================] - 0s 14us/step - loss: 37733721.2690 - val_loss: 36357381.7569 Epoch 223/400 2320/2320 [==============================] - 0s 14us/step - loss: 37986215.2276 - val_loss: 34627255.1622 Epoch 224/400 2320/2320 [==============================] - 0s 14us/step - loss: 37239252.6759 - val_loss: 34186884.0168 Epoch 225/400 2320/2320 [==============================] - 0s 14us/step - loss: 37502061.2414 - val_loss: 34240953.4273 Epoch 226/400 2320/2320 [==============================] - 0s 14us/step - loss: 37160727.3931 - val_loss: 34107102.5637 Epoch 227/400 2320/2320 [==============================] - 0s 14us/step - loss: 37097947.8621 - val_loss: 33990430.3025 Epoch 228/400 2320/2320 [==============================] - 0s 15us/step - loss: 37024133.7931 - val_loss: 34237788.6671 Epoch 229/400 2320/2320 [==============================] - 0s 15us/step - loss: 36995281.6552 - val_loss: 34088153.9754 Epoch 230/400 2320/2320 [==============================] - 0s 17us/step - loss: 36790925.3517 - val_loss: 33836066.2392 Epoch 231/400 2320/2320 [==============================] - 0s 15us/step - loss: 37323788.9379 - val_loss: 33646325.9509 Epoch 232/400 2320/2320 [==============================] - 0s 14us/step - loss: 36772440.6897 - val_loss: 33663991.2877 Epoch 233/400 2320/2320 
[==============================] - 0s 21us/step - loss: 36534291.8345 - val_loss: 33572364.6231 Epoch 234/400 2320/2320 [==============================] - 0s 21us/step - loss: 36403769.3793 - val_loss: 33348953.3665 Epoch 235/400 2320/2320 [==============================] - 0s 22us/step - loss: 36440168.2345 - val_loss: 33322552.0362 Epoch 236/400 2320/2320 [==============================] - 0s 21us/step - loss: 36384558.3724 - val_loss: 33365606.7666 Epoch 237/400 2320/2320 [==============================] - 0s 23us/step - loss: 36387945.7655 - val_loss: 33405710.1888 Epoch 238/400 2320/2320 [==============================] - 0s 18us/step - loss: 36162752.0414 - val_loss: 33064765.0381 Epoch 239/400 2320/2320 [==============================] - 0s 17us/step - loss: 36153474.1793 - val_loss: 32915710.9024 Epoch 240/400 2320/2320 [==============================] - 0s 17us/step - loss: 35920930.9931 - val_loss: 32891700.5262 Epoch 241/400 2320/2320 [==============================] - 0s 16us/step - loss: 35797883.2345 - val_loss: 32969662.2172 Epoch 242/400 2320/2320 [==============================] - 0s 17us/step - loss: 35930957.7241 - val_loss: 32690288.9968 Epoch 243/400 2320/2320 [==============================] - 0s 17us/step - loss: 35573527.6138 - val_loss: 32599563.6600 Epoch 244/400 2320/2320 [==============================] - 0s 17us/step - loss: 35488770.6207 - val_loss: 32544604.5107 Epoch 245/400 2320/2320 [==============================] - 0s 17us/step - loss: 35623309.1862 - val_loss: 32424563.4040 Epoch 246/400 2320/2320 [==============================] - 0s 17us/step - loss: 35292238.7862 - val_loss: 32316866.9722 Epoch 247/400 2320/2320 [==============================] - 0s 18us/step - loss: 35470028.0828 - val_loss: 32301457.1868 Epoch 248/400 2320/2320 [==============================] - 0s 16us/step - loss: 35101087.6552 - val_loss: 32267594.7951 Epoch 249/400 2320/2320 [==============================] - 0s 17us/step - loss: 35153097.7103 - 
val_loss: 32095475.1803 Epoch 250/400 2320/2320 [==============================] - 0s 17us/step - loss: 35140810.2069 - val_loss: 32400586.4240 Epoch 251/400 2320/2320 [==============================] - 0s 17us/step - loss: 34872784.4276 - val_loss: 31871802.9877 Epoch 252/400 2320/2320 [==============================] - 0s 17us/step - loss: 34838250.2069 - val_loss: 31801138.6761 Epoch 253/400 2320/2320 [==============================] - 0s 16us/step - loss: 34619967.7793 - val_loss: 31704170.5249 Epoch 254/400 2320/2320 [==============================] - 0s 18us/step - loss: 34554314.8414 - val_loss: 31775040.1047 Epoch 255/400 2320/2320 [==============================] - 0s 16us/step - loss: 34419912.7724 - val_loss: 31511016.5598 Epoch 256/400 2320/2320 [==============================] - 0s 17us/step - loss: 34505734.6759 - val_loss: 31646502.9774 Epoch 257/400 2320/2320 [==============================] - 0s 17us/step - loss: 34320703.0621 - val_loss: 31555993.9341 Epoch 258/400 2320/2320 [==============================] - 0s 17us/step - loss: 34295664.8276 - val_loss: 32048809.2915 Epoch 259/400 2320/2320 [==============================] - 0s 17us/step - loss: 34268368.1103 - val_loss: 31217432.5275 Epoch 260/400 2320/2320 [==============================] - 0s 16us/step - loss: 34346850.7448 - val_loss: 31323303.7557 Epoch 261/400 2320/2320 [==============================] - 0s 19us/step - loss: 34218816.1103 - val_loss: 31027011.3963 Epoch 262/400 2320/2320 [==============================] - 0s 17us/step - loss: 33889603.1172 - val_loss: 30994612.0918 Epoch 263/400 2320/2320 [==============================] - 0s 17us/step - loss: 33790790.7862 - val_loss: 31488244.5637 Epoch 264/400 2320/2320 [==============================] - 0s 18us/step - loss: 33780086.2621 - val_loss: 30787632.3180 Epoch 265/400 2320/2320 [==============================] - 0s 18us/step - loss: 33753200.6207 - val_loss: 30698314.8403 Epoch 266/400 2320/2320 
[==============================] - 0s 17us/step - loss: 33437838.8276 - val_loss: 30623268.2198 Epoch 267/400 2320/2320 [==============================] - 0s 18us/step - loss: 33461754.8966 - val_loss: 30656292.4719 Epoch 268/400 2320/2320 [==============================] - 0s 17us/step - loss: 33559176.0276 - val_loss: 30976610.5495 Epoch 269/400 2320/2320 [==============================] - 0s 18us/step - loss: 33235658.5379 - val_loss: 30381242.3142 Epoch 270/400 2320/2320 [==============================] - 0s 17us/step - loss: 33273769.1586 - val_loss: 30518517.8423 Epoch 271/400 2320/2320 [==============================] - 0s 18us/step - loss: 33355190.0690 - val_loss: 30954563.6975 Epoch 272/400 2320/2320 [==============================] - 0s 17us/step - loss: 33240510.8966 - val_loss: 30166791.3070 Epoch 273/400 2320/2320 [==============================] - 0s 18us/step - loss: 32965294.6759 - val_loss: 30262808.7873 Epoch 274/400 2320/2320 [==============================] - 0s 16us/step - loss: 32842574.3448 - val_loss: 30503646.8352 Epoch 275/400 2320/2320 [==============================] - 0s 17us/step - loss: 32911796.0276 - val_loss: 30219442.6387 Epoch 276/400 2320/2320 [==============================] - 0s 17us/step - loss: 32791863.0069 - val_loss: 29871497.2295 Epoch 277/400 2320/2320 [==============================] - 0s 17us/step - loss: 32609135.2276 - val_loss: 29773125.7867 Epoch 278/400 2320/2320 [==============================] - 0s 17us/step - loss: 32620390.5103 - val_loss: 30457690.8093 Epoch 279/400 2320/2320 [==============================] - 0s 17us/step - loss: 32554299.8345 - val_loss: 29749928.2676 Epoch 280/400 2320/2320 [==============================] - 0s 18us/step - loss: 32439423.0759 - val_loss: 29702733.1920 Epoch 281/400 2320/2320 [==============================] - 0s 17us/step - loss: 32509097.9310 - val_loss: 30262788.8882 Epoch 282/400 2320/2320 [==============================] - 0s 17us/step - loss: 32318611.3793 - 
val_loss: 29530342.0000 Epoch 283/400 2320/2320 [==============================] - 0s 17us/step - loss: 32281025.7103 - val_loss: 29415710.1616 Epoch 284/400 2320/2320 [==============================] - 0s 17us/step - loss: 32117399.5310 - val_loss: 29729941.8255 Epoch 285/400 2320/2320 [==============================] - 0s 17us/step - loss: 32533035.5586 - val_loss: 31086042.3103 Epoch 286/400 2320/2320 [==============================] - 0s 17us/step - loss: 32281727.8345 - val_loss: 29160546.7356 Epoch 287/400 2320/2320 [==============================] - 0s 17us/step - loss: 32138005.9862 - val_loss: 29116253.8436 Epoch 288/400 2320/2320 [==============================] - 0s 17us/step - loss: 31854379.7241 - val_loss: 29038718.5171 Epoch 289/400 2320/2320 [==============================] - 0s 17us/step - loss: 31870953.1034 - val_loss: 28981447.6897 Epoch 290/400 2320/2320 [==============================] - 0s 17us/step - loss: 31694925.4069 - val_loss: 29055217.7453 Epoch 291/400 2320/2320 [==============================] - 0s 17us/step - loss: 31890007.2552 - val_loss: 28926475.3756 Epoch 292/400 2320/2320 [==============================] - 0s 18us/step - loss: 31692925.5310 - val_loss: 29273561.9767 Epoch 293/400 2320/2320 [==============================] - 0s 16us/step - loss: 31933162.5103 - val_loss: 28796054.4240 Epoch 294/400 2320/2320 [==============================] - 0s 17us/step - loss: 31618934.8000 - val_loss: 28705809.9560 Epoch 295/400 2320/2320 [==============================] - 0s 17us/step - loss: 31440731.3379 - val_loss: 28893588.8093 Epoch 296/400 2320/2320 [==============================] - 0s 17us/step - loss: 31540108.5241 - val_loss: 28678081.1105 Epoch 297/400 2320/2320 [==============================] - 0s 18us/step - loss: 31506688.3862 - val_loss: 28537741.1080 Epoch 298/400 2320/2320 [==============================] - 0s 17us/step - loss: 31295426.4828 - val_loss: 28466279.5436 Epoch 299/400 2320/2320 
[==============================] - 0s 18us/step - loss: 31193556.8552 - val_loss: 28394825.6044 Epoch 300/400 2320/2320 [==============================] - 0s 17us/step - loss: 31212702.9241 - val_loss: 28506529.0407 Epoch 301/400 2320/2320 [==============================] - 0s 18us/step - loss: 31279642.7172 - val_loss: 28959034.8442 Epoch 302/400 2320/2320 [==============================] - 0s 17us/step - loss: 31113918.7310 - val_loss: 28230438.4693 Epoch 303/400 2320/2320 [==============================] - 0s 17us/step - loss: 31128286.1517 - val_loss: 28341790.3206 Epoch 304/400 2320/2320 [==============================] - 0s 17us/step - loss: 31005541.5862 - val_loss: 29360098.1681 Epoch 305/400 2320/2320 [==============================] - 0s 17us/step - loss: 31077385.8483 - val_loss: 29145235.9056 Epoch 306/400 2320/2320 [==============================] - 0s 18us/step - loss: 31138031.0345 - val_loss: 29503927.8966 Epoch 307/400 2320/2320 [==============================] - 0s 17us/step - loss: 31131681.1586 - val_loss: 28252173.9845 Epoch 308/400 2320/2320 [==============================] - 0s 18us/step - loss: 31120154.6483 - val_loss: 28210468.4732 Epoch 309/400 2320/2320 [==============================] - 0s 17us/step - loss: 30916142.9517 - val_loss: 27911599.1028 Epoch 310/400 2320/2320 [==============================] - 0s 18us/step - loss: 30885808.7517 - val_loss: 28023053.8397 Epoch 311/400 2320/2320 [==============================] - 0s 17us/step - loss: 30767377.7655 - val_loss: 27988592.9412 Epoch 312/400 2320/2320 [==============================] - 0s 18us/step - loss: 31221571.3655 - val_loss: 27851233.8061 Epoch 313/400 2320/2320 [==============================] - 0s 17us/step - loss: 30574699.3379 - val_loss: 27964681.5760 Epoch 314/400 2320/2320 [==============================] - 0s 16us/step - loss: 30505519.7586 - val_loss: 27745660.7589 Epoch 315/400 2320/2320 [==============================] - 0s 17us/step - loss: 30500748.9103 - 
val_loss: 28954268.0866 Epoch 316/400 2320/2320 [==============================] - 0s 17us/step - loss: 31070452.2483 - val_loss: 27609969.1441 Epoch 317/400 2320/2320 [==============================] - 0s 17us/step - loss: 30391635.6138 - val_loss: 27745826.5029 Epoch 318/400 2320/2320 [==============================] - 0s 17us/step - loss: 30379507.7241 - val_loss: 27724357.5462 Epoch 319/400 2320/2320 [==============================] - 0s 18us/step - loss: 30430812.7172 - val_loss: 27790100.0685 Epoch 320/400 2320/2320 [==============================] - 0s 19us/step - loss: 30348231.5034 - val_loss: 27467142.3723 Epoch 321/400 2320/2320 [==============================] - 0s 21us/step - loss: 30256492.3034 - val_loss: 27590417.7285 Epoch 322/400 2320/2320 [==============================] - 0s 19us/step - loss: 30352740.5793 - val_loss: 27405840.6723 Epoch 323/400 2320/2320 [==============================] - 0s 17us/step - loss: 30261609.8207 - val_loss: 27541374.7136 Epoch 324/400 2320/2320 [==============================] - 0s 17us/step - loss: 30138441.8207 - val_loss: 27358087.1299 Epoch 325/400 2320/2320 [==============================] - 0s 17us/step - loss: 30296803.1448 - val_loss: 28961633.8953 Epoch 326/400 2320/2320 [==============================] - 0s 17us/step - loss: 31183099.8897 - val_loss: 28387074.7382 Epoch 327/400 2320/2320 [==============================] - 0s 17us/step - loss: 30352648.0552 - val_loss: 27284562.2469 Epoch 328/400 2320/2320 [==============================] - 0s 17us/step - loss: 29996747.4483 - val_loss: 27230062.4215 Epoch 329/400 2320/2320 [==============================] - 0s 16us/step - loss: 30015755.7793 - val_loss: 27333812.2043 Epoch 330/400 2320/2320 [==============================] - 0s 16us/step - loss: 29924393.4345 - val_loss: 27181465.8591 Epoch 331/400 2320/2320 [==============================] - 0s 18us/step - loss: 30129189.4345 - val_loss: 27106208.4602 Epoch 332/400 2320/2320 
[==============================] - 0s 17us/step - loss: 30051806.8138 - val_loss: 27302329.0847 Epoch 333/400 2320/2320 [==============================] - 0s 17us/step - loss: 29984982.7034 - val_loss: 27033481.9237 Epoch 334/400 2320/2320 [==============================] - 0s 17us/step - loss: 30153252.1931 - val_loss: 27046831.7440 Epoch 335/400 2320/2320 [==============================] - 0s 18us/step - loss: 29741132.9517 - val_loss: 26916898.8908 Epoch 336/400 2320/2320 [==============================] - 0s 16us/step - loss: 29746440.0828 - val_loss: 26992385.4867 Epoch 337/400 2320/2320 [==============================] - 0s 16us/step - loss: 29941857.8483 - val_loss: 30057843.5579 Epoch 338/400 2320/2320 [==============================] - 0s 17us/step - loss: 31367061.2000 - val_loss: 26958495.1920 Epoch 339/400 2320/2320 [==============================] - 0s 17us/step - loss: 30237262.2069 - val_loss: 26846319.8306 Epoch 340/400 2320/2320 [==============================] - 0s 17us/step - loss: 29860045.4897 - val_loss: 26888622.4525 Epoch 341/400 2320/2320 [==============================] - 0s 17us/step - loss: 30009915.8621 - val_loss: 26745267.7479 Epoch 342/400 2320/2320 [==============================] - 0s 17us/step - loss: 29787119.3241 - val_loss: 27483571.9987 Epoch 343/400 2320/2320 [==============================] - 0s 17us/step - loss: 29734336.3862 - val_loss: 26667528.0284 Epoch 344/400 2320/2320 [==============================] - 0s 17us/step - loss: 29452618.8138 - val_loss: 26686774.4874 Epoch 345/400 2320/2320 [==============================] - 0s 18us/step - loss: 29459722.9793 - val_loss: 26649434.6438 Epoch 346/400 2320/2320 [==============================] - 0s 17us/step - loss: 29699598.4000 - val_loss: 26576231.2747 Epoch 347/400 2320/2320 [==============================] - 0s 18us/step - loss: 29502368.6897 - val_loss: 26545002.3685 Epoch 348/400 2320/2320 [==============================] - 0s 17us/step - loss: 29452155.1448 - 
val_loss: 26673648.2741 Epoch 349/400 2320/2320 [==============================] - 0s 17us/step - loss: 29474237.6690 - val_loss: 26494851.2877 Epoch 350/400 2320/2320 [==============================] - 0s 16us/step - loss: 29367120.6621 - val_loss: 26500373.3510 Epoch 351/400 2320/2320 [==============================] - 0s 17us/step - loss: 29535920.5241 - val_loss: 26453705.8694 Epoch 352/400 2320/2320 [==============================] - 0s 17us/step - loss: 29479016.6759 - val_loss: 27194247.2812 Epoch 353/400 2320/2320 [==============================] - 0s 16us/step - loss: 29371226.0276 - val_loss: 26425984.0246 Epoch 354/400 2320/2320 [==============================] - 0s 18us/step - loss: 29224606.0966 - val_loss: 26471120.5443 Epoch 355/400 2320/2320 [==============================] - 0s 16us/step - loss: 29173249.9724 - val_loss: 26408711.1984 Epoch 356/400 2320/2320 [==============================] - 0s 17us/step - loss: 29251209.1586 - val_loss: 26308739.4712 Epoch 357/400 2320/2320 [==============================] - 0s 18us/step - loss: 29372296.7172 - val_loss: 26537730.6516 Epoch 358/400 2320/2320 [==============================] - 0s 17us/step - loss: 29299933.2414 - val_loss: 26281030.1254 Epoch 359/400 2320/2320 [==============================] - 0s 17us/step - loss: 29197071.3103 - val_loss: 26268708.9282 Epoch 360/400 2320/2320 [==============================] - 0s 16us/step - loss: 29276841.9448 - val_loss: 26230408.7434 Epoch 361/400 2320/2320 [==============================] - 0s 18us/step - loss: 29043461.8759 - val_loss: 26245589.2450 Epoch 362/400 2320/2320 [==============================] - 0s 16us/step - loss: 29097019.7517 - val_loss: 26713447.1299 Epoch 363/400 2320/2320 [==============================] - 0s 17us/step - loss: 29280908.4690 - val_loss: 26377869.2786 Epoch 364/400 2320/2320 [==============================] - 0s 17us/step - loss: 29352336.0000 - val_loss: 27530585.6432 Epoch 365/400 2320/2320 
[==============================] - 0s 16us/step - loss: 29420682.0414 - val_loss: 26392354.7046 Epoch 366/400 2320/2320 [==============================] - 0s 18us/step - loss: 29207153.8207 - val_loss: 27464540.7731 Epoch 367/400 2320/2320 [==============================] - 0s 17us/step - loss: 29647441.5310 - val_loss: 26324699.4118 Epoch 368/400 2320/2320 [==============================] - 0s 18us/step - loss: 29394112.2483 - val_loss: 26306287.3329 Epoch 369/400 2320/2320 [==============================] - 0s 17us/step - loss: 29060487.5448 - val_loss: 26247321.6690 Epoch 370/400 2320/2320 [==============================] - 0s 17us/step - loss: 29036135.2276 - val_loss: 26227275.0006 Epoch 371/400 2320/2320 [==============================] - 0s 17us/step - loss: 29012983.4483 - val_loss: 26004856.5055 Epoch 372/400 2320/2320 [==============================] - 0s 17us/step - loss: 28850819.5034 - val_loss: 25974379.5191 Epoch 373/400 2320/2320 [==============================] - 0s 18us/step - loss: 29046087.7793 - val_loss: 26050367.3782 Epoch 374/400 2320/2320 [==============================] - 0s 17us/step - loss: 28849453.0345 - val_loss: 26300016.4577 Epoch 375/400 2320/2320 [==============================] - 0s 18us/step - loss: 28973619.3241 - val_loss: 26332777.4505 Epoch 376/400 2320/2320 [==============================] - 0s 17us/step - loss: 28947798.4414 - val_loss: 25888348.0517 Epoch 377/400 2320/2320 [==============================] - 0s 17us/step - loss: 28807658.4276 - val_loss: 25915534.7511 Epoch 378/400 2320/2320 [==============================] - 0s 16us/step - loss: 28846215.9034 - val_loss: 25857204.3103 Epoch 379/400 2320/2320 [==============================] - 0s 17us/step - loss: 28771451.9172 - val_loss: 25882951.0911 Epoch 380/400 2320/2320 [==============================] - 0s 17us/step - loss: 28764760.0414 - val_loss: 25910001.7647 Epoch 381/400 2320/2320 [==============================] - 0s 17us/step - loss: 28655981.8759 - 
val_loss: 26612694.1616 Epoch 382/400 2320/2320 [==============================] - 0s 17us/step - loss: 28950532.2483 - val_loss: 25998139.4053 Epoch 383/400 2320/2320 [==============================] - 0s 17us/step - loss: 28839017.9862 - val_loss: 25798953.4079 Epoch 384/400 2320/2320 [==============================] - 0s 17us/step - loss: 28687401.9586 - val_loss: 25796097.1803 Epoch 385/400 2320/2320 [==============================] - 0s 18us/step - loss: 28690290.8138 - val_loss: 25889611.8100 Epoch 386/400 2320/2320 [==============================] - 0s 17us/step - loss: 28637815.0759 - val_loss: 26097119.1506 Epoch 387/400 2320/2320 [==============================] - 0s 18us/step - loss: 28786485.2276 - val_loss: 26107436.9179 Epoch 388/400 2320/2320 [==============================] - 0s 17us/step - loss: 28532402.4828 - val_loss: 25705385.0420 Epoch 389/400 2320/2320 [==============================] - 0s 17us/step - loss: 28644226.5931 - val_loss: 25829953.0549 Epoch 390/400 2320/2320 [==============================] - 0s 17us/step - loss: 28753554.1103 - val_loss: 26311160.8623 Epoch 391/400 2320/2320 [==============================] - 0s 17us/step - loss: 28971734.9517 - val_loss: 25655658.9244 Epoch 392/400 2320/2320 [==============================] - 0s 17us/step - loss: 28630702.9379 - val_loss: 25951065.3678 Epoch 393/400 2320/2320 [==============================] - 0s 17us/step - loss: 28510374.3172 - val_loss: 26729832.5042 Epoch 394/400 2320/2320 [==============================] - 0s 18us/step - loss: 28851288.3862 - val_loss: 25608489.9483 Epoch 395/400 2320/2320 [==============================] - 0s 16us/step - loss: 28542411.1724 - val_loss: 25610058.8856 Epoch 396/400 2320/2320 [==============================] - 0s 17us/step - loss: 28426214.1241 - val_loss: 25822278.2107 Epoch 397/400 2320/2320 [==============================] - 0s 17us/step - loss: 28627793.2690 - val_loss: 25554673.0381 Epoch 398/400 2320/2320 
[==============================] - 0s 17us/step - loss: 28873377.2138 - val_loss: 25884462.7201 Epoch 399/400 2320/2320 [==============================] - 0s 18us/step - loss: 29729830.1241 - val_loss: 25924865.7182 Epoch 400/400 2320/2320 [==============================] - 0s 17us/step - loss: 28549975.7517 - val_loss: 25603943.4493
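The validation loss is still decreasing slowly at epoch 400, so the choice of 400 epochs is a judgment call; Keras offers an `EarlyStopping` callback that stops training once the validation loss stops improving. The plain-Python sketch below (a hypothetical helper, not from the notebook) shows the patience logic such a callback applies:

```python
def early_stop_epoch(val_losses, patience=10, min_delta=0.0):
    """Return the 1-based epoch at which training would stop: the first
    epoch after `patience` consecutive epochs without the validation loss
    improving by more than `min_delta`."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss   # new best: reset the patience counter
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # patience never exhausted: train to the end
```

With the Keras model itself, the equivalent would be passing `callbacks=[EarlyStopping(monitor='val_loss', patience=10)]` to `model.fit`.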
losses = pd.DataFrame(model.history.history)
losses.plot(figsize=(20,8))
<AxesSubplot:>
predictions = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test, predictions)))
# Note: this is RMSE as a percentage of the mean hire count (a normalized
# RMSE), not a true mean absolute percentage error
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, predictions))
      / all_data['Number_of_Bicycle_Hires'].mean() * 100)
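Worth noting: the quantity printed as "MAPE" above is RMSE divided by the mean of the target, i.e. a normalized RMSE (sometimes called CV(RMSE)), not a true mean absolute percentage error, which averages the per-observation percentage errors. A minimal sketch of the two quantities (the helper names are illustrative):

```python
import numpy as np

def mape(y_true, y_pred):
    """True MAPE: mean of per-observation absolute percentage errors."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

def cv_rmse(y_true, y_pred):
    """RMSE normalized by the target mean (what the notebook prints)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / y_true.mean() * 100
```

The two can differ substantially when the target is skewed, so it helps to be explicit about which one is being reported.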
plt.figure(figsize=(25,10))
sns.set(font_scale=2)
plt.scatter(y_test, predictions, color='salmon', edgecolor='k', s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
# y = x reference line: points on it are perfect predictions
plt.plot([0, 60000], [0, 60000])
plt.xlim(0, 60000)
plt.ylim(0, 60000);
MAE: 3697.932497878959
MSE: 25603943.448845204
RMSE: 5060.033937519115
MAPE: 19.591311437564336
final_data_all=data_daily.merge(power_generation, on='Date', how='inner')
final_data_all.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | ... | Wind | OCGT | Oil | Biomass | French Int | Dutch Int | NI Int | Eire Int | Nemo Int | Net Supply | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | ... | 4841 | 15 | 0 | 0 | 44898 | 0 | -7411 | 0 | 0 | 781168 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | ... | 7013 | 0 | 0 | 0 | 46443 | 0 | -4932 | 0 | 0 | 693004 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | ... | 4264 | 0 | 0 | 0 | 47760 | 0 | -5775 | 0 | 0 | 674225 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | ... | 866 | 0 | 0 | 0 | 45391 | 0 | -7895 | 0 | 0 | 782488 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | ... | 5358 | 4 | 0 | 0 | 45788 | 0 | -7593 | 0 | 0 | 784771 |
5 rows × 26 columns
Similar to the earlier analysis, correlation with the power generation data has been checked to identify features that are important for the analysis.
data_corr = final_data_all.corr()
mask = np.array(data_corr)
sns.set_context('poster',font_scale=0.3)
mask[np.tril_indices_from(mask)] = False
fig = plt.subplots(figsize=(20,20))
sns.heatmap(data_corr, mask=mask, vmax=1, square=True, annot=True, cmap='coolwarm');
Observation:
From the correlation heatmap, it can be seen that the Number of Bicycle Hires is positively correlated with power generation features such as 'Dutch Int', 'NI Int', 'Biomass', 'Net Pumped' and 'Hydro', which are carried forward as model features below.
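Rather than reading values off the heatmap, the correlations with the target can be ranked numerically. A self-contained sketch on a toy frame standing in for `final_data_all` (column names are illustrative):

```python
import numpy as np
import pandas as pd

# Toy frame standing in for final_data_all: hires driven mostly by temperature
rng = np.random.default_rng(0)
temp = rng.normal(20, 5, 300)
hires = 1000 * temp + rng.normal(0, 500, 300)
df = pd.DataFrame({'tempC': temp,
                   'pressure': rng.normal(1013, 5, 300),
                   'Number_of_Bicycle_Hires': hires})

# Rank features by their correlation with the target
corr_with_target = (df.corr()['Number_of_Bicycle_Hires']
                      .drop('Number_of_Bicycle_Hires')
                      .sort_values(ascending=False))
print(corr_with_target)
```

On the real merged dataframe the same pattern (`final_data_all.corr()['Number_of_Bicycle_Hires'].sort_values()`) would list the candidate features directly.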
# One Hot Encoding
all_data_we_power_ = pd.concat([final_data_all, pd.get_dummies(final_data_all.ST_YEAR, prefix = 'Time')], axis = 1)
Linear Regression
X = all_data_we_power_[['ST_DAY', 'ST_MONTH','Time_2010', 'Time_2011',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend','Dutch Int', 'NI Int','Biomass','Net Pumped','Hydro','Non_work_day']]
y = all_data_we_power_['Number_of_Bicycle_Hires']
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
from sklearn.linear_model import LinearRegression
lm = LinearRegression()
lm.fit(X_train,y_train)
predictions = lm.predict(X_test)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# Diagonal reference line: points on it correspond to perfect predictions
plt.plot([0,50000],[0,50000])
plt.xlim(0,50000)
plt.ylim(0,50000);
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
MAE: 5493.203713988535
MSE: 51081604.70464822
RMSE: 7147.139616983022
MAPE: 27.672114427106
Decision Tree Regressor
from sklearn.tree import DecisionTreeRegressor
dt = DecisionTreeRegressor(random_state=0)
dt_params = {'max_depth':np.arange(1,50,2),'min_samples_leaf':np.arange(2,15)}
from sklearn.model_selection import GridSearchCV
gs_dt = GridSearchCV(dt,dt_params,cv=3)
gs_dt.fit(X_train,y_train)
a = gs_dt.best_params_
# Training with the best parameters found by the grid search
dtr = DecisionTreeRegressor(max_depth=a['max_depth'], min_samples_leaf=a['min_samples_leaf'], random_state=0)
model = dtr.fit(X_train,y_train)
predictions = model.predict(X_test)
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# Diagonal reference line: points on it correspond to perfect predictions
plt.plot([0,50000],[0,50000])
plt.xlim(0,50000)
plt.ylim(0,50000);
MAE: 4455.944576252236
MSE: 35599724.21726034
RMSE: 5966.55044537967
MAPE: 23.10113912806731
Random Forest Regressor
from sklearn.ensemble import RandomForestRegressor
rf = RandomForestRegressor(random_state=0)
rf_params = {'n_estimators':np.arange(25,150,25),'max_depth':np.arange(1,11,2),
'min_samples_leaf':np.arange(2,15,3)}
from sklearn.model_selection import GridSearchCV
gs_rf = GridSearchCV(rf,rf_params,cv=3)
gs_rf.fit(X_train,y_train)
b = gs_rf.best_params_
RF = RandomForestRegressor(n_estimators=b['n_estimators'],max_depth=b['max_depth'],
min_samples_leaf=b['min_samples_leaf'],random_state=0)
model = RF.fit(X_train,y_train)
pred = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, pred))
print('MSE:', metrics.mean_squared_error(y_test, pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,pred)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# Diagonal reference line: points on it correspond to perfect predictions
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4090.945841853403
MSE: 31501625.362083893
RMSE: 5612.630877056133
MAPE: 21.730842293600954
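Beyond the error metrics, a fitted forest exposes `feature_importances_`, which could be used to sanity-check the features picked from the correlation heatmap. A minimal sketch on synthetic data (the names `X_demo`, `rf_demo` etc. are illustrative, not from the notebook):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Toy data: the target depends strongly on 'signal' and not on the noise columns
rng = np.random.default_rng(0)
X_demo = pd.DataFrame(rng.normal(size=(300, 3)),
                      columns=['signal', 'noise_a', 'noise_b'])
y_demo = 10 * X_demo['signal'] + rng.normal(scale=0.1, size=300)

rf_demo = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_demo, y_demo)
# Importances sum to 1; larger values mean the feature drove more splits
importances = pd.Series(rf_demo.feature_importances_,
                        index=X_demo.columns).sort_values(ascending=False)
print(importances)
```

Applied to `RF` above, `pd.Series(RF.feature_importances_, index=X.columns)` would rank the year dummies, weekday flags and power features by contribution.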
AdaBoost Regressor
from sklearn.ensemble import AdaBoostRegressor
ar = AdaBoostRegressor(base_estimator=RF,random_state=0)
ar_params = {'n_estimators':np.arange(25,200,25)}
from sklearn.model_selection import GridSearchCV
gs_ar = GridSearchCV(ar,ar_params,cv=3)
gs_ar.fit(X_train,y_train)
c = gs_ar.best_params_
# Fitting the model with best params
ab_rf = AdaBoostRegressor(base_estimator=RF,n_estimators=c['n_estimators'],random_state=0)
model = ab_rf.fit(X_train,y_train)
y_pred = model.predict(X_test);
print('MAE:', metrics.mean_absolute_error(y_test, y_pred))
print('MSE:', metrics.mean_squared_error(y_test, y_pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,y_pred)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
y_pred))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,y_pred,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# Diagonal reference line: points on it correspond to perfect predictions
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4199.358365146831
MSE: 30303117.208364554
RMSE: 5504.8267191951245
MAPE: 21.313448881422605
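Boosting on top of a full random forest is expensive, so it can help to check how error evolves per boosting round via `staged_predict` before committing to a large `n_estimators`. A small sketch on synthetic data with a lightweight tree base estimator (all names here are illustrative):

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

# Toy regression problem with a nonlinear signal
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(200, 2))
y_demo = X_demo[:, 0] ** 2 + rng.normal(scale=0.1, size=200)

ab_demo = AdaBoostRegressor(DecisionTreeRegressor(max_depth=3),
                            n_estimators=50, random_state=0).fit(X_demo, y_demo)

# staged_predict yields predictions after each boosting round,
# showing how the training error changes as estimators are added
train_errors = [np.mean((y_demo - p) ** 2) for p in ab_demo.staged_predict(X_demo)]
print(f'round 1 MSE: {train_errors[0]:.3f}, final MSE: {train_errors[-1]:.3f}')
```

If the error curve flattens after a few dozen rounds, the `n_estimators` grid searched above could be narrowed accordingly.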
Neural Network
X = all_data_we_power_[['ST_DAY', 'ST_MONTH','Time_2010', 'Time_2011',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend','Dutch Int', 'NI Int','Biomass','Net Pumped','Hydro','Non_work_day']]
y = all_data_we_power_['Number_of_Bicycle_Hires']
from sklearn.preprocessing import MinMaxScaler
scaler = MinMaxScaler()  # note: defined here but never applied below, so the network trains on unscaled features
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(19,activation='relu'))
model.add(Dense(19,activation='relu'))
model.add(Dense(19,activation='relu'))
model.add(Dense(19,activation='relu'))
model.add(Dense(1))
model.compile(optimizer='adam',loss='mse')
from tensorflow.keras.callbacks import EarlyStopping
early_stop = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=25)
model.fit(x=X_train.values,y=y_train.values,
validation_data=(X_test.values,y_test.values),batch_size=128,epochs=400,callbacks=[early_stop]);
Train on 2320 samples, validate on 1547 samples
Epoch 1/400 - loss: 825296023.3931 - val_loss: 765035901.5178
Epoch 2/400 - loss: 754209266.7586 - val_loss: 705043307.3562
...
Epoch 184/400 - loss: 94895017.3793 - val_loss: 97180958.4357
Epoch 185/400 - loss: 95012284.9103 - val_loss: 98234595.5036
Epoch 00185: early stopping
(per-epoch training log truncated)
losses = pd.DataFrame(model.history.history)
losses.plot(figsize=(20,8))
<AxesSubplot:>
predictions = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test,
predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values');
# Diagonal reference line: points on it correspond to perfect predictions
plt.plot([0,60000],[0,60000])
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 7686.9390648987965
MSE: 98234596.23289318
RMSE: 9911.336753076912
MAPE: 38.37446299565929
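One likely reason the network underperforms the tree models is that `scaler` was instantiated but never applied, so the Dense layers see raw feature magnitudes (year dummies next to values in the tens of thousands). A minimal sketch of the intended fit-on-train / transform-both pattern, on toy arrays rather than the notebook's data:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy train/test splits with very different feature scales
X_tr = np.array([[2010.0, 100.0], [2021.0, 60000.0]])
X_te = np.array([[2015.0, 30000.0]])

scaler = MinMaxScaler()
X_tr_scaled = scaler.fit_transform(X_tr)  # learns min/max from the training split only
X_te_scaled = scaler.transform(X_te)      # reuses the training min/max (no leakage)
print(X_tr_scaled)
print(X_te_scaled)
```

Feeding `X_tr_scaled` / `X_te_scaled` (as arrays) into `model.fit` would put all inputs on a [0, 1] scale, which typically stabilizes training for networks like the one above.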
Merging weather, power generation and holiday data into a new dataframe 'final_data_w_p_h'
final_data_w_p_h=final_data_all.merge(weatherdata, on='Date', how='outer')
final_data_w_p_h.head()
| Date | DAY | DAY_NUMBER | WEEKDAY_WEEKEND | ST_DAY | ST_MONTH | Month | ST_YEAR | Non_work_day | Wkday_Wend | ... | cloudcover | humidity | precipMM | pressure | tempC | visibility | winddirDegree | windspeedKmph | location | Dates | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2010-07-30 | Friday | 4 | Weekday | 30 | 7 | July | 2010 | 0.0 | 1.0 | ... | 39 | 70 | 0.0 | 1015 | 24 | 9 | 269 | 9 | London | 2010-07-30 |
| 1 | 2010-07-31 | Saturday | 5 | Weekend | 31 | 7 | July | 2010 | 0.0 | 0.0 | ... | 74 | 81 | 2.0 | 1012 | 22 | 8 | 237 | 12 | London | 2010-07-31 |
| 2 | 2010-08-01 | Sunday | 6 | Weekend | 1 | 8 | August | 2010 | 0.0 | 0.0 | ... | 57 | 77 | 2.9 | 1013 | 22 | 9 | 257 | 8 | London | 2010-08-01 |
| 3 | 2010-08-02 | Monday | 0 | Weekday | 2 | 8 | August | 2010 | 0.0 | 1.0 | ... | 33 | 75 | 7.3 | 1017 | 22 | 9 | 257 | 8 | London | 2010-08-02 |
| 4 | 2010-08-03 | Tuesday | 1 | Weekday | 3 | 8 | August | 2010 | 0.0 | 1.0 | ... | 37 | 77 | 0.1 | 1016 | 23 | 7 | 252 | 9 | London | 2010-08-03 |
5 rows × 51 columns
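Because this merge uses `how='outer'` (unlike the earlier inner join), dates present in only one of the two frames survive with NaNs in the other frame's columns. `merge(..., indicator=True)` could be used to audit the overlap; a toy sketch (column names illustrative):

```python
import pandas as pd

# Two small frames sharing one date, each with one unmatched date
left = pd.DataFrame({'Date': pd.to_datetime(['2010-07-30', '2010-07-31']),
                     'hires': [6897, 5564]})
right = pd.DataFrame({'Date': pd.to_datetime(['2010-07-31', '2010-08-01']),
                      'tempC': [22, 22]})

# indicator=True adds a '_merge' column: 'both', 'left_only' or 'right_only'
merged = left.merge(right, on='Date', how='outer', indicator=True)
print(merged['_merge'].value_counts())
```

Checking `final_data_w_p_h.isna().sum()` after such an outer merge would reveal how many rows lack weather values before modelling on them.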
data_corr = final_data_w_p_h.corr()
mask = np.array(data_corr)
sns.set_context('poster',font_scale=0.3)
mask[np.tril_indices_from(mask)] = False
fig = plt.subplots(figsize=(20,20))
sns.heatmap(data_corr, mask=mask, vmax=1, square=True, annot=True, cmap='coolwarm');
From the correlation heatmap, it can be seen that the Number of Bicycle Hires is positively correlated with:
1. 'ST_YEAR'
2. 'Weekday/Weekend'
3. 'NI Int'
4. 'Dutch Int'
5. 'Biomass'
6. 'Net Pumped'
7. 'Hydro'
8. 'Holiday data'
9. 'Max Temperature'
10. 'Minimum Temperature'
11. 'Sun Hour'
12. 'UV Index'
13. 'Feels Like C'
14. 'Heat Index'
15. 'Wind Chill'
16. 'Pressure'
17. 'Temperature'
18. 'Wkday_Wend'
19. 'Non_work_day'
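Rather than reading the features off the heatmap by eye, the positively correlated columns can be pulled out programmatically from the correlation matrix; a small sketch on a toy frame (column names are illustrative stand-ins for the notebook's columns):

```python
import pandas as pd

# Toy frame; in the notebook this would be final_data_w_p_h with
# 'Number_of_Bicycle_Hires' as the target column
df = pd.DataFrame({'hires':    [10, 20, 30, 40],
                   'tempC':    [11, 19, 31, 39],
                   'precipMM': [4, 3, 2, 1]})

# Correlation of every column with the target, target itself excluded
corr = df.corr()['hires'].drop('hires')
positive = corr[corr > 0].sort_values(ascending=False)
print(positive.index.tolist())  # → ['tempC']
```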
# One Hot Encoding
final_data_w_p_h = pd.concat([final_data_w_p_h, pd.get_dummies(final_data_w_p_h.ST_YEAR, prefix='Time')], axis=1)
X = final_data_w_p_h[['ST_DAY', 'ST_MONTH','Time_2010', 'Time_2011','maxtempC','mintempC','sunHour', 'uvIndex','FeelsLikeC',
'HeatIndexC', 'WindChillC','tempC','pressure','visibility',
'Time_2012', 'Time_2013', 'Time_2014', 'Time_2015', 'Time_2016',
'Time_2017', 'Time_2018', 'Time_2019', 'Time_2020', 'Time_2021',
'Wkday_Wend','Dutch Int', 'NI Int','Biomass','Net Pumped','Hydro','Non_work_day']]
y = final_data_w_p_h['Number_of_Bicycle_Hires']
Linear Regression
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=101)
from sklearn.linear_model import LinearRegression
lm = LinearRegression()
lm.fit(X_train,y_train)
predictions = lm.predict(X_test)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test, predictions, color='salmon', edgecolor='k', s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
# Reference line y = x: points on it would be perfect predictions
plt.plot([0, 50000], [0, 50000], c='black')
plt.xlim(0, 50000)
plt.ylim(0, 50000);
# Metrics for Evaluation
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
# Note: this is RMSE as a percentage of the mean hire count (normalized RMSE), not a true MAPE
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, predictions))
      / all_data['Number_of_Bicycle_Hires'].mean() * 100)
MAE: 3535.41201966348
MSE: 24212469.62064801
RMSE: 4920.616792704753
MAPE: 19.051519661951595
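The same four metrics are printed after every model below; a small helper function (the name `report` is my own) keeps the computation consistent across models:

```python
import numpy as np
from sklearn import metrics

def report(y_true, y_pred, baseline_mean):
    """Return MAE, MSE, RMSE, and RMSE as a % of a baseline mean
    (the 'MAPE' figure used in this notebook)."""
    mae = metrics.mean_absolute_error(y_true, y_pred)
    mse = metrics.mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    return mae, mse, rmse, rmse / baseline_mean * 100

# Tiny worked example: both errors are 10, so MAE = RMSE = 10
mae, mse, rmse, pct = report([100, 200], [110, 190], 150)
print(mae, rmse)  # → 10.0 10.0
```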
Decision Tree Regressor
from sklearn.tree import DecisionTreeRegressor
dt = DecisionTreeRegressor(random_state=0)
dt_params = {'max_depth':np.arange(1,50,2),'min_samples_leaf':np.arange(2,15)}
from sklearn.model_selection import GridSearchCV
gs_dt = GridSearchCV(dt,dt_params,cv=3)
gs_dt.fit(X_train,y_train)
a = gs_dt.best_params_
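Worth noting: `GridSearchCV` refits the winning configuration on the full training split by default, so `best_estimator_` can be used directly instead of rebuilding the model from `best_params_`; a sketch on synthetic data (demo variable names are my own):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeRegressor

X_demo, y_demo = make_regression(n_samples=200, n_features=5, random_state=0)
gs = GridSearchCV(DecisionTreeRegressor(random_state=0),
                  {'max_depth': [2, 4, 8]}, cv=3)
gs.fit(X_demo, y_demo)

# best_estimator_ is already refit on all of (X_demo, y_demo),
# so a separate re-training step is optional
best_model = gs.best_estimator_
print(gs.best_params_)
```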
# Training with best parameters
dtr = DecisionTreeRegressor(max_depth=a['max_depth'], min_samples_leaf=a['min_samples_leaf'], random_state=0)
model = dtr.fit(X_train,y_train)
predictions = model.predict(X_test)
from sklearn import metrics
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,predictions)))
# Note: this is RMSE as a percentage of the mean hire count (normalized RMSE), not a true MAPE
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, predictions))
      / all_data['Number_of_Bicycle_Hires'].mean() * 100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test, predictions, color='salmon', edgecolor='k', s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
# Reference line y = x: points on it would be perfect predictions
plt.plot([0, 50000], [0, 50000], c='black')
plt.xlim(0, 50000)
plt.ylim(0, 50000);
MAE: 3888.546066804508
MSE: 28374324.448264435
RMSE: 5326.755527360387
MAPE: 20.62399734406729
Random Forest Regressor
# Finding best parameters for RandomForestRegressor
from sklearn.ensemble import RandomForestRegressor
rf = RandomForestRegressor(random_state=0)
rf_params = {'n_estimators':np.arange(25,150,25),'max_depth':np.arange(1,11,2),
'min_samples_leaf':np.arange(2,15,3)}
from sklearn.model_selection import GridSearchCV
gs_rf = GridSearchCV(rf,rf_params,cv=3)
gs_rf.fit(X_train,y_train)
b = gs_rf.best_params_
RF = RandomForestRegressor(n_estimators=b['n_estimators'],max_depth=b['max_depth'],
min_samples_leaf=b['min_samples_leaf'],random_state=0)
model = RF.fit(X_train,y_train)
pred = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, pred))
print('MSE:', metrics.mean_squared_error(y_test, pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,pred)))
# Note: this is RMSE as a percentage of the mean hire count (normalized RMSE), not a true MAPE
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, pred))
      / all_data['Number_of_Bicycle_Hires'].mean() * 100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test, pred, color='salmon', edgecolor='k', s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
# Reference line y = x: points on it would be perfect predictions
plt.plot([0, 60000], [0, 60000], c='black')
plt.xlim(0, 60000)
plt.ylim(0, 60000);
MAE: 3137.3129993360812
MSE: 17907989.974249594
RMSE: 4231.7833089903825
MAPE: 16.384511599415326
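Since the random forest is the strongest single model so far, its `feature_importances_` attribute can show which inputs drive the predictions (in the notebook the index would be `X.columns`); a sketch on synthetic data with my own demo names:

```python
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for the notebook's X/y: 4 features, 2 informative
Xd, yd = make_regression(n_samples=300, n_features=4, n_informative=2,
                         random_state=0)
Xd = pd.DataFrame(Xd, columns=['f0', 'f1', 'f2', 'f3'])
rf_demo = RandomForestRegressor(n_estimators=50, random_state=0).fit(Xd, yd)

# Importances are normalized to sum to 1 across features
importances = pd.Series(rf_demo.feature_importances_, index=Xd.columns)
print(importances.sort_values(ascending=False))
```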
AdaBoost Regressor
from sklearn.ensemble import AdaBoostRegressor
ar = AdaBoostRegressor(base_estimator=RF,random_state=0)
ar_params = {'n_estimators':np.arange(25,200,25)}
from sklearn.model_selection import GridSearchCV
gs_ar = GridSearchCV(ar,ar_params,cv=3)
gs_ar.fit(X_train,y_train)
c = gs_ar.best_params_
# Fitting the model with best params
ab_rf = AdaBoostRegressor(base_estimator=RF,n_estimators=c['n_estimators'],random_state=0)
model = ab_rf.fit(X_train,y_train)
y_pred = model.predict(X_test);
print('MAE:', metrics.mean_absolute_error(y_test, y_pred))
print('MSE:', metrics.mean_squared_error(y_test, y_pred))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test,y_pred)))
# Note: this is RMSE as a percentage of the mean hire count (normalized RMSE), not a true MAPE
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, y_pred))
      / all_data['Number_of_Bicycle_Hires'].mean() * 100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test, y_pred, color='salmon', edgecolor='k', s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
# Reference line y = x: points on it would be perfect predictions
plt.plot([0, 60000], [0, 60000], c='black')
plt.xlim(0, 60000)
plt.ylim(0, 60000);
MAE: 3213.0576436104457
MSE: 17504361.86740389
RMSE: 4183.82144305943
MAPE: 16.1988140597975
Neural Network
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(30, activation='relu'))
model.add(Dense(25, activation='relu'))
model.add(Dense(20, activation='relu'))
model.add(Dense(10, activation='relu'))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
from tensorflow.keras.callbacks import EarlyStopping
early_stop = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=25)
model.fit(x=X_train.values,y=y_train.values,
validation_data=(X_test.values,y_test.values),batch_size=128,epochs=400,callbacks=[early_stop]);
Train on 2320 samples, validate on 1547 samples
Epoch 1/400   - loss: 803368300.5793 - val_loss: 733427799.3743
Epoch 2/400   - loss: 703106218.8138 - val_loss: 604352643.8888
Epoch 3/400   - loss: 479828585.4897 - val_loss: 275419702.2573
...
Epoch 248/400 - loss: 37863744.5241 - val_loss: 46557490.0297
Epoch 249/400 - loss: 39521197.0345 - ...
(per-epoch output truncated: training loss falls from roughly 8.0e8 to 3.5e7 over the first ~250 epochs)
[==============================] - 0s 12us/step - loss: 35475282.1517 - val_loss: 35243474.2986 Epoch 234/400 2320/2320 [==============================] - 0s 12us/step - loss: 34392586.7862 - val_loss: 34703369.0291 Epoch 235/400 2320/2320 [==============================] - 0s 12us/step - loss: 34343662.8966 - val_loss: 35514351.4415 Epoch 236/400 2320/2320 [==============================] - 0s 12us/step - loss: 34786652.3034 - val_loss: 35599893.5876 Epoch 237/400 2320/2320 [==============================] - 0s 12us/step - loss: 34407908.9103 - val_loss: 35396738.8442 Epoch 238/400 2320/2320 [==============================] - 0s 12us/step - loss: 33482777.3793 - val_loss: 34496660.4370 Epoch 239/400 2320/2320 [==============================] - 0s 12us/step - loss: 33648555.3655 - val_loss: 35222631.9716 Epoch 240/400 2320/2320 [==============================] - 0s 12us/step - loss: 36157244.0828 - val_loss: 37503738.8597 Epoch 241/400 2320/2320 [==============================] - 0s 12us/step - loss: 34292009.5172 - val_loss: 34374877.1351 Epoch 242/400 2320/2320 [==============================] - 0s 12us/step - loss: 34604291.6552 - val_loss: 38840349.7557 Epoch 243/400 2320/2320 [==============================] - 0s 12us/step - loss: 34675054.5103 - val_loss: 35189585.4635 Epoch 244/400 2320/2320 [==============================] - 0s 12us/step - loss: 34804879.3517 - val_loss: 34625690.5055 Epoch 245/400 2320/2320 [==============================] - 0s 12us/step - loss: 34205281.1310 - val_loss: 34424147.0226 Epoch 246/400 2320/2320 [==============================] - 0s 12us/step - loss: 34317958.3448 - val_loss: 34437357.1506 Epoch 247/400 2320/2320 [==============================] - 0s 12us/step - loss: 33931386.2069 - val_loss: 38144680.4706 Epoch 248/400 2320/2320 [==============================] - 0s 12us/step - loss: 37863744.5241 - val_loss: 46557490.0297 Epoch 249/400 2320/2320 [==============================] - 0s 12us/step - loss: 39521197.0345 - 
val_loss: 35525373.7388 Epoch 250/400 2320/2320 [==============================] - 0s 12us/step - loss: 33697436.6069 - val_loss: 35297621.7957 Epoch 251/400 2320/2320 [==============================] - 0s 12us/step - loss: 33984091.2000 - val_loss: 34794820.4990 Epoch 252/400 2320/2320 [==============================] - 0s 12us/step - loss: 33631783.6138 - val_loss: 34941844.8662 Epoch 253/400 2320/2320 [==============================] - 0s 12us/step - loss: 34008634.3724 - val_loss: 34488348.8326 Epoch 254/400 2320/2320 [==============================] - 0s 12us/step - loss: 33608375.0345 - val_loss: 35415457.1092 Epoch 255/400 2320/2320 [==============================] - 0s 12us/step - loss: 33612194.5379 - val_loss: 34420162.3374 Epoch 256/400 2320/2320 [==============================] - 0s 12us/step - loss: 33972367.5310 - val_loss: 35341784.4783 Epoch 257/400 2320/2320 [==============================] - 0s 12us/step - loss: 33299161.8207 - val_loss: 34470151.4803 Epoch 258/400 2320/2320 [==============================] - 0s 12us/step - loss: 33241696.8552 - val_loss: 34882684.8352 Epoch 259/400 2320/2320 [==============================] - 0s 12us/step - loss: 34180553.4345 - val_loss: 35662928.0595 Epoch 260/400 2320/2320 [==============================] - 0s 12us/step - loss: 35009315.4759 - val_loss: 38885983.8087 Epoch 261/400 2320/2320 [==============================] - 0s 12us/step - loss: 34082966.4138 - val_loss: 34234878.0840 Epoch 262/400 2320/2320 [==============================] - 0s 12us/step - loss: 34951249.5448 - val_loss: 34984071.5785 Epoch 263/400 2320/2320 [==============================] - 0s 12us/step - loss: 34656513.9862 - val_loss: 36676788.5766 Epoch 264/400 2320/2320 [==============================] - 0s 12us/step - loss: 35393867.4069 - val_loss: 36898783.8655 Epoch 265/400 2320/2320 [==============================] - 0s 12us/step - loss: 34071898.9379 - val_loss: 35686618.8520 Epoch 266/400 2320/2320 
[==============================] - 0s 12us/step - loss: 33567618.2897 - val_loss: 34154295.1752 Epoch 267/400 2320/2320 [==============================] - 0s 12us/step - loss: 34444904.3310 - val_loss: 37731680.3982 Epoch 268/400 2320/2320 [==============================] - 0s 12us/step - loss: 34628803.7379 - val_loss: 35029931.1622 Epoch 269/400 2320/2320 [==============================] - 0s 12us/step - loss: 33639811.8897 - val_loss: 35078604.2094 Epoch 270/400 2320/2320 [==============================] - 0s 12us/step - loss: 34412876.5241 - val_loss: 37501197.9910 Epoch 271/400 2320/2320 [==============================] - 0s 12us/step - loss: 33846006.0690 - val_loss: 34527200.4240 Epoch 272/400 2320/2320 [==============================] - 0s 12us/step - loss: 33420376.1103 - val_loss: 34137016.7576 Epoch 273/400 2320/2320 [==============================] - 0s 12us/step - loss: 33796560.5517 - val_loss: 34945634.7175 Epoch 274/400 2320/2320 [==============================] - 0s 12us/step - loss: 33002653.2138 - val_loss: 35291249.9289 Epoch 275/400 2320/2320 [==============================] - 0s 12us/step - loss: 33126662.7034 - val_loss: 35251135.8630 Epoch 276/400 2320/2320 [==============================] - 0s 12us/step - loss: 33979643.6586 - val_loss: 34306434.4072 Epoch 277/400 2320/2320 [==============================] - 0s 12us/step - loss: 33562715.7793 - val_loss: 34181509.8048 Epoch 278/400 2320/2320 [==============================] - 0s 12us/step - loss: 33889615.3379 - val_loss: 37653481.1946 Epoch 279/400 2320/2320 [==============================] - 0s 12us/step - loss: 36506802.4690 - val_loss: 34696138.8908 Epoch 280/400 2320/2320 [==============================] - 0s 12us/step - loss: 32950414.9241 - val_loss: 34578879.4131 Epoch 281/400 2320/2320 [==============================] - 0s 12us/step - loss: 34226092.4138 - val_loss: 36767813.0601 Epoch 282/400 2320/2320 [==============================] - 0s 12us/step - loss: 35215025.6966 - 
val_loss: 34690207.0679 Epoch 283/400 2320/2320 [==============================] - 0s 12us/step - loss: 33519816.2207 - val_loss: 34806885.7531 Epoch 284/400 2320/2320 [==============================] - 0s 12us/step - loss: 33254692.9103 - val_loss: 34141104.5223 Epoch 285/400 2320/2320 [==============================] - 0s 12us/step - loss: 33262483.5586 - val_loss: 34200566.5236 Epoch 286/400 2320/2320 [==============================] - 0s 12us/step - loss: 33344925.7931 - val_loss: 38554952.2883 Epoch 287/400 2320/2320 [==============================] - 0s 12us/step - loss: 34420545.1586 - val_loss: 34394819.3820 Epoch 288/400 2320/2320 [==============================] - 0s 12us/step - loss: 33207106.8966 - val_loss: 33893614.0866 Epoch 289/400 2320/2320 [==============================] - 0s 12us/step - loss: 33091079.0483 - val_loss: 34281580.0827 Epoch 290/400 2320/2320 [==============================] - 0s 12us/step - loss: 33166321.4345 - val_loss: 34578872.4189 Epoch 291/400 2320/2320 [==============================] - 0s 12us/step - loss: 33082862.7586 - val_loss: 33810850.9244 Epoch 292/400 2320/2320 [==============================] - 0s 12us/step - loss: 32727291.6966 - val_loss: 33871697.3187 Epoch 293/400 2320/2320 [==============================] - 0s 12us/step - loss: 32658380.2345 - val_loss: 35016261.9703 Epoch 294/400 2320/2320 [==============================] - 0s 12us/step - loss: 32621258.9379 - val_loss: 35447603.7776 Epoch 295/400 2320/2320 [==============================] - 0s 12us/step - loss: 34065196.0276 - val_loss: 34465325.8048 Epoch 296/400 2320/2320 [==============================] - 0s 12us/step - loss: 32568055.3931 - val_loss: 35366226.9606 Epoch 297/400 2320/2320 [==============================] - 0s 12us/step - loss: 33756652.3586 - val_loss: 34736821.0446 Epoch 298/400 2320/2320 [==============================] - 0s 12us/step - loss: 32539998.7862 - val_loss: 34970778.5908 Epoch 299/400 2320/2320 
[==============================] - 0s 12us/step - loss: 33079517.1310 - val_loss: 38025271.8190 Epoch 300/400 2320/2320 [==============================] - 0s 12us/step - loss: 34404146.4690 - val_loss: 33872031.4493 Epoch 301/400 2320/2320 [==============================] - 0s 12us/step - loss: 34401698.1517 - val_loss: 34681154.2262 Epoch 302/400 2320/2320 [==============================] - 0s 12us/step - loss: 32729604.8414 - val_loss: 35594524.0621 Epoch 303/400 2320/2320 [==============================] - 0s 12us/step - loss: 32711870.1793 - val_loss: 35797357.0886 Epoch 304/400 2320/2320 [==============================] - 0s 12us/step - loss: 33089043.6552 - val_loss: 33615673.8384 Epoch 305/400 2320/2320 [==============================] - 0s 12us/step - loss: 32683658.6621 - val_loss: 33639305.2359 Epoch 306/400 2320/2320 [==============================] - 0s 12us/step - loss: 34048681.3655 - val_loss: 38383075.7078 Epoch 307/400 2320/2320 [==============================] - 0s 12us/step - loss: 33394339.8069 - val_loss: 37395231.1235 Epoch 308/400 2320/2320 [==============================] - 0s 12us/step - loss: 33144361.1034 - val_loss: 36843923.7052 Epoch 309/400 2320/2320 [==============================] - 0s 12us/step - loss: 32749085.5724 - val_loss: 34268704.4059 Epoch 310/400 2320/2320 [==============================] - 0s 12us/step - loss: 34121844.5103 - val_loss: 34777779.7944 Epoch 311/400 2320/2320 [==============================] - 0s 12us/step - loss: 33041111.7103 - val_loss: 34556375.7466 Epoch 312/400 2320/2320 [==============================] - 0s 12us/step - loss: 33094438.8966 - val_loss: 35443679.1390 Epoch 313/400 2320/2320 [==============================] - 0s 12us/step - loss: 32508195.1724 - val_loss: 34009886.4538 Epoch 314/400 2320/2320 [==============================] - 0s 12us/step - loss: 33656032.3586 - val_loss: 34132830.4964 Epoch 315/400 2320/2320 [==============================] - 0s 12us/step - loss: 32319037.6276 - 
val_loss: 33935861.1842 Epoch 316/400 2320/2320 [==============================] - 0s 12us/step - loss: 32706904.0552 - val_loss: 33962069.5902 Epoch 317/400 2320/2320 [==============================] - 0s 12us/step - loss: 33491920.2483 - val_loss: 33704895.2993 Epoch 318/400 2320/2320 [==============================] - 0s 12us/step - loss: 32886279.6828 - val_loss: 34727535.4803 Epoch 319/400 2320/2320 [==============================] - 0s 12us/step - loss: 32353168.5517 - val_loss: 33625041.0162 Epoch 320/400 2320/2320 [==============================] - 0s 12us/step - loss: 33086720.7034 - val_loss: 33525882.0349 Epoch 321/400 2320/2320 [==============================] - 0s 12us/step - loss: 32732965.8621 - val_loss: 36236594.3788 Epoch 322/400 2320/2320 [==============================] - 0s 12us/step - loss: 32239408.7172 - val_loss: 33752145.1429 Epoch 323/400 2320/2320 [==============================] - 0s 12us/step - loss: 32310100.8552 - val_loss: 34079720.6050 Epoch 324/400 2320/2320 [==============================] - 0s 12us/step - loss: 33979154.8966 - val_loss: 38661433.1765 Epoch 325/400 2320/2320 [==============================] - 0s 12us/step - loss: 36631987.8069 - val_loss: 39819831.4880 Epoch 326/400 2320/2320 [==============================] - 0s 12us/step - loss: 33415327.7241 - val_loss: 34661444.9619 Epoch 327/400 2320/2320 [==============================] - 0s 12us/step - loss: 33242444.0828 - val_loss: 34944448.9076 Epoch 328/400 2320/2320 [==============================] - 0s 12us/step - loss: 32676249.5310 - val_loss: 33797377.3807 Epoch 329/400 2320/2320 [==============================] - 0s 12us/step - loss: 32498396.4690 - val_loss: 35520713.3510 Epoch 330/400 2320/2320 [==============================] - 0s 12us/step - loss: 34716571.5862 - val_loss: 34950169.9315 Epoch 331/400 2320/2320 [==============================] - 0s 12us/step - loss: 33469833.0483 - val_loss: 41148063.8138 Epoch 332/400 2320/2320 
[==============================] - 0s 12us/step - loss: 35403626.3379 - val_loss: 36056399.2372 Epoch 333/400 2320/2320 [==============================] - 0s 12us/step - loss: 32563899.2552 - val_loss: 34959012.5094 Epoch 334/400 2320/2320 [==============================] - 0s 12us/step - loss: 31892833.8207 - val_loss: 34404008.0608 Epoch 335/400 2320/2320 [==============================] - 0s 12us/step - loss: 33064216.0828 - val_loss: 36794618.8287 Epoch 336/400 2320/2320 [==============================] - 0s 12us/step - loss: 32356572.6759 - val_loss: 34902612.6076 Epoch 337/400 2320/2320 [==============================] - 0s 12us/step - loss: 32244554.4828 - val_loss: 33593962.9373 Epoch 338/400 2320/2320 [==============================] - 0s 13us/step - loss: 32485389.8207 - val_loss: 33941712.9024 Epoch 339/400 2320/2320 [==============================] - 0s 12us/step - loss: 32229055.9310 - val_loss: 34408384.1939 Epoch 340/400 2320/2320 [==============================] - 0s 16us/step - loss: 32573343.4207 - val_loss: 34140396.5611 Epoch 341/400 2320/2320 [==============================] - 0s 19us/step - loss: 33205203.5862 - val_loss: 34037663.3588 Epoch 342/400 2320/2320 [==============================] - 0s 17us/step - loss: 33830333.8345 - val_loss: 34443013.9186 Epoch 343/400 2320/2320 [==============================] - 0s 17us/step - loss: 32542116.9103 - val_loss: 33760202.8313 Epoch 344/400 2320/2320 [==============================] - 0s 16us/step - loss: 32305327.7241 - val_loss: 34979076.8817 Epoch 345/400 2320/2320 [==============================] - 0s 17us/step - loss: 32868609.6000 - val_loss: 33524773.5850 Epoch 346/400 2320/2320 [==============================] - 0s 13us/step - loss: 32203406.5931 - val_loss: 33433057.7039 Epoch 347/400 2320/2320 [==============================] - 0s 12us/step - loss: 31675870.7448 - val_loss: 34020440.0827 Epoch 348/400 2320/2320 [==============================] - 0s 12us/step - loss: 31860821.9034 - 
val_loss: 34068274.3814 Epoch 349/400 2320/2320 [==============================] - 0s 12us/step - loss: 31709122.7034 - val_loss: 33456933.2256 Epoch 350/400 2320/2320 [==============================] - 0s 12us/step - loss: 31772157.4759 - val_loss: 33994429.9780 Epoch 351/400 2320/2320 [==============================] - 0s 12us/step - loss: 31755871.2276 - val_loss: 34197552.1551 Epoch 352/400 2320/2320 [==============================] - 0s 12us/step - loss: 32917553.4069 - val_loss: 34988457.5410 Epoch 353/400 2320/2320 [==============================] - 0s 12us/step - loss: 32501962.1379 - val_loss: 34224958.3051 Epoch 354/400 2320/2320 [==============================] - 0s 12us/step - loss: 33313070.4552 - val_loss: 40631634.7434 Epoch 355/400 2320/2320 [==============================] - 0s 12us/step - loss: 36972789.2828 - val_loss: 34008318.7576 Epoch 356/400 2320/2320 [==============================] - 0s 12us/step - loss: 32005799.8897 - val_loss: 35169146.8804 Epoch 357/400 2320/2320 [==============================] - 0s 12us/step - loss: 34909073.1034 - val_loss: 40648168.0129 Epoch 358/400 2320/2320 [==============================] - 0s 12us/step - loss: 35218656.0000 - val_loss: 44688030.5676 Epoch 359/400 2320/2320 [==============================] - 0s 12us/step - loss: 40120614.2897 - val_loss: 40475073.1532 Epoch 360/400 2320/2320 [==============================] - 0s 12us/step - loss: 33486911.7241 - val_loss: 34575392.9386 Epoch 361/400 2320/2320 [==============================] - 0s 12us/step - loss: 32508124.3034 - val_loss: 33415683.1286 Epoch 362/400 2320/2320 [==============================] - 0s 12us/step - loss: 32505350.5379 - val_loss: 33728242.7330 Epoch 363/400 2320/2320 [==============================] - 0s 12us/step - loss: 31595908.8276 - val_loss: 34020548.2353 Epoch 364/400 2320/2320 [==============================] - 0s 12us/step - loss: 31942429.2414 - val_loss: 33814972.7886 Epoch 365/400 2320/2320 
[==============================] - 0s 12us/step - loss: 31702274.8414 - val_loss: 35964700.7033 Epoch 366/400 2320/2320 [==============================] - 0s 12us/step - loss: 32684897.3241 - val_loss: 33808136.6723 Epoch 367/400 2320/2320 [==============================] - 0s 12us/step - loss: 33780714.1931 - val_loss: 39107685.1920 Epoch 368/400 2320/2320 [==============================] - 0s 12us/step - loss: 33080532.8552 - val_loss: 34913755.9108 Epoch 369/400 2320/2320 [==============================] - 0s 12us/step - loss: 31984889.2690 - val_loss: 33528837.1687 Epoch 370/400 2320/2320 [==============================] - 0s 12us/step - loss: 31273254.7034 - val_loss: 33540969.2437 Epoch 371/400 2320/2320 [==============================] - 0s 12us/step - loss: 31440974.9241 - val_loss: 34436585.7350 Epoch 372/400 2320/2320 [==============================] - 0s 12us/step - loss: 31501499.6138 - val_loss: 37688590.4034 Epoch 373/400 2320/2320 [==============================] - 0s 12us/step - loss: 33181693.7517 - val_loss: 33669685.5928 Epoch 374/400 2320/2320 [==============================] - 0s 12us/step - loss: 31217306.4414 - val_loss: 33555799.0950 Epoch 375/400 2320/2320 [==============================] - 0s 13us/step - loss: 31570021.4621 - val_loss: 33634605.2256 Epoch 376/400 2320/2320 [==============================] - 0s 13us/step - loss: 31800259.3655 - val_loss: 34223616.7292 Epoch 377/400 2320/2320 [==============================] - 0s 12us/step - loss: 32394725.5862 - val_loss: 34471502.1125 Epoch 378/400 2320/2320 [==============================] - 0s 12us/step - loss: 31308316.6483 - val_loss: 34153380.5326 Epoch 379/400 2320/2320 [==============================] - 0s 12us/step - loss: 31738765.7655 - val_loss: 34034179.1105 Epoch 380/400 2320/2320 [==============================] - 0s 12us/step - loss: 32073353.3517 - val_loss: 34875447.5087 Epoch 381/400 2320/2320 [==============================] - 0s 12us/step - loss: 31357944.1517 - 
val_loss: 33958771.5036 Epoch 382/400 2320/2320 [==============================] - 0s 12us/step - loss: 32477691.0345 - val_loss: 34129842.2172 Epoch 383/400 2320/2320 [==============================] - 0s 12us/step - loss: 32496356.9103 - val_loss: 33865066.3581 Epoch 384/400 2320/2320 [==============================] - 0s 12us/step - loss: 32047889.9034 - val_loss: 34293007.6251 Epoch 385/400 2320/2320 [==============================] - 0s 12us/step - loss: 31443269.7931 - val_loss: 33587300.1267 Epoch 386/400 2320/2320 [==============================] - 0s 12us/step - loss: 32413074.6069 - val_loss: 34010843.4803 Epoch 00386: early stopping
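The run halts via Keras early stopping, which tracks the best validation loss and stops once it fails to improve for a set number of consecutive epochs. A minimal pure-Python sketch of that logic (function and parameter names are illustrative, not the Keras API):

```python
def stopped_epoch(val_losses, patience):
    # Return the 1-based epoch at which training would stop, or the
    # final epoch if the patience budget is never exhausted.
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0   # new best: reset the patience counter
        else:
            wait += 1              # no improvement this epoch
            if wait >= patience:
                return epoch
    return len(val_losses)

print(stopped_epoch([5.0, 4.0, 4.2, 4.1, 4.3, 4.4], patience=3))  # 5
```

With `patience=3`, the best loss (4.0 at epoch 2) is followed by three non-improving epochs, so training stops at epoch 5 — the same mechanism that ended the run above at epoch 386.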
losses = pd.DataFrame(model.history.history)
losses.plot(figsize=(20,8))
predictions = model.predict(X_test)
print('MAE:', metrics.mean_absolute_error(y_test, predictions))
print('MSE:', metrics.mean_squared_error(y_test, predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(y_test, predictions)))
# Note: the value labelled "MAPE" here is RMSE expressed as a percentage of the
# mean hire count (a normalized RMSE), not the mean absolute percentage error
print('MAPE:', np.sqrt(metrics.mean_squared_error(y_test, predictions))/all_data['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(y_test,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
plt.plot([0,60000],[0,60000])  # identity line: points on it are perfect predictions
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 4673.300665729032
MSE: 34010843.55665941
RMSE: 5831.881648032598
MAPE: 22.579731884098845
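The figure labelled MAPE above is RMSE divided by the mean hire count (a normalized RMSE) rather than the textbook mean absolute percentage error. A minimal pure-Python sketch of the two quantities (function names are illustrative):

```python
import math

def mape(y_true, y_pred):
    # True MAPE: mean of per-point |error| / |actual|, as a percentage
    return 100.0 * sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred)) / len(y_true)

def nrmse_pct(y_true, y_pred):
    # What the notebook prints as "MAPE": RMSE as a percentage of the mean actual
    rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
    return 100.0 * rmse / (sum(y_true) / len(y_true))

y_true = [100.0, 200.0, 300.0]
y_pred = [110.0, 190.0, 330.0]
print(round(mape(y_true, y_pred), 2))       # 8.33
print(round(nrmse_pct(y_true, y_pred), 2))  # 9.57
```

The two generally differ: true MAPE weights errors relative to each actual value, while the normalized RMSE scales one aggregate error by the overall mean, and is more forgiving of large errors on large actuals.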
data_daily_TS=data_daily[['Date','Number_of_Bicycle_Hires']]
data_daily_TS=data_daily_TS.set_index('Date')
data_daily_TS.head()
| Number_of_Bicycle_Hires | |
|---|---|
| Date | |
| 2010-07-30 | 6897.0 |
| 2010-07-31 | 5564.0 |
| 2010-08-01 | 4303.0 |
| 2010-08-02 | 6642.0 |
| 2010-08-03 | 7966.0 |
The train/test split for the time-series models is not done with the usual train_test_split() utility. Splitting chronologically instead preserves the continuity of the data and respects the autocorrelation between neighbouring observations.
train_data = data_daily_TS.iloc[:3000]  # rows 0-2999 (up to but not including 3000)
test_data = data_daily_TS.iloc[2999:]   # starts at row 2999, overlapping the last training row so the plotted series join up
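The same idea extends to cross-validation: instead of random folds, use expanding windows in which every validation fold lies strictly after its training window (scikit-learn's TimeSeriesSplit, not used in this notebook, implements this). A minimal pure-Python sketch with illustrative names:

```python
def expanding_window_splits(n, n_splits):
    # Yield (train_indices, test_indices) pairs in temporal order: each
    # test fold starts immediately after a growing training window, so
    # no future observation ever leaks into training.
    fold = n // (n_splits + 1)
    for k in range(1, n_splits + 1):
        train_end = fold * k
        yield list(range(train_end)), list(range(train_end, min(train_end + fold, n)))

for train_idx, test_idx in expanding_window_splits(12, 3):
    print(len(train_idx), test_idx)
```

Each fold's training set ends where its test set begins, mirroring the single chronological split used above.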
# Legacy statsmodels ARIMA API; statsmodels.tsa.arima_model was removed in
# statsmodels 0.13+, where statsmodels.tsa.arima.model.ARIMA replaces it
from statsmodels.tsa.arima_model import ARMA,ARMAResults,ARIMA,ARIMAResults
from statsmodels.graphics.tsaplots import plot_acf,plot_pacf
from pmdarima import auto_arima
import warnings
warnings.filterwarnings("ignore")
auto_arima(train_data['Number_of_Bicycle_Hires'])
ARIMA(order=(4, 1, 3), scoring_args={}, with_intercept=False)
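auto_arima ranks candidate orders by an information criterion (AIC by default), trading goodness of fit against parameter count: AIC = 2k − 2 ln L̂. A tiny sketch of the comparison; the (5,1,5) entry uses the log-likelihood from the fitted summary below, while the (4,1,3) values are hypothetical, for illustration only:

```python
def aic(n_params, log_likelihood):
    # Akaike information criterion: 2k - 2*ln(L); lower is better
    return 2 * n_params - 2 * log_likelihood

candidates = {
    (5, 1, 5): (12, -29691.853),  # 10 ARMA terms + constant + sigma2; LL from the summary below
    (4, 1, 3): (9, -29720.0),     # hypothetical log-likelihood for illustration
}
best = min(candidates, key=lambda order: aic(*candidates[order]))
print(best, round(aic(*candidates[best]), 3))
```

For the real (5,1,5) fit this reproduces the AIC of about 59407.7 reported in the summary table.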
model = ARIMA(train_data['Number_of_Bicycle_Hires'],order=(5,1,5))  # (5,1,5) chosen over the auto_arima suggestion of (4,1,3)
results = model.fit()
results.summary()
| Dep. Variable: | D.Number_of_Bicycle_Hires | No. Observations: | 2999 |
|---|---|---|---|
| Model: | ARIMA(5, 1, 5) | Log Likelihood | -29691.853 |
| Method: | css-mle | S.D. of innovations | nan |
| Date: | Mon, 19 Apr 2021 | AIC | 59407.707 |
| Time: | 11:44:07 | BIC | 59479.779 |
| Sample: | 07-31-2010 | HQIC | 59433.631 |
| | - 10-15-2018 | | |
| coef | std err | z | P>\|z\| | [0.025 | 0.975] | |
|---|---|---|---|---|---|---|
| const | 7.0500 | nan | nan | nan | nan | nan |
| ar.L1.D.Number_of_Bicycle_Hires | 1.1009 | 1.2e-05 | 9.17e+04 | 0.000 | 1.101 | 1.101 |
| ar.L2.D.Number_of_Bicycle_Hires | -1.6852 | 1.51e-05 | -1.11e+05 | 0.000 | -1.685 | -1.685 |
| ar.L3.D.Number_of_Bicycle_Hires | 1.2340 | 2.14e-05 | 5.78e+04 | 0.000 | 1.234 | 1.234 |
| ar.L4.D.Number_of_Bicycle_Hires | -1.2397 | 2.26e-05 | -5.48e+04 | 0.000 | -1.240 | -1.240 |
| ar.L5.D.Number_of_Bicycle_Hires | 0.2989 | 1.44e-06 | 2.07e+05 | 0.000 | 0.299 | 0.299 |
| ma.L1.D.Number_of_Bicycle_Hires | -1.6543 | nan | nan | nan | nan | nan |
| ma.L2.D.Number_of_Bicycle_Hires | 2.1158 | nan | nan | nan | nan | nan |
| ma.L3.D.Number_of_Bicycle_Hires | -2.0213 | nan | nan | nan | nan | nan |
| ma.L4.D.Number_of_Bicycle_Hires | 1.6593 | nan | nan | nan | nan | nan |
| ma.L5.D.Number_of_Bicycle_Hires | -0.8395 | nan | nan | nan | nan | nan |
| Real | Imaginary | Modulus | Frequency | |
|---|---|---|---|---|
| AR.1 | -0.2224 | -0.9750j | 1.0000 | -0.2857 |
| AR.2 | -0.2224 | +0.9750j | 1.0000 | 0.2857 |
| AR.3 | 0.6234 | -0.7819j | 1.0000 | -0.1429 |
| AR.4 | 0.6234 | +0.7819j | 1.0000 | 0.1429 |
| AR.5 | 3.3451 | -0.0000j | 3.3451 | -0.0000 |
| MA.1 | -0.2224 | -0.9789j | 1.0039 | -0.2856 |
| MA.2 | -0.2224 | +0.9789j | 1.0039 | 0.2856 |
| MA.3 | 0.6274 | -0.7872j | 1.0066 | -0.1429 |
| MA.4 | 0.6274 | +0.7872j | 1.0066 | 0.1429 |
| MA.5 | 1.1664 | -0.0000j | 1.1664 | -0.0000 |
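The root tables above report each root's modulus: for a stationary AR polynomial and an invertible MA polynomial, all roots must lie outside the unit circle (modulus > 1). Several moduli here sit essentially on the circle, which is a warning sign for this fit and consistent with the nan standard errors in the summary. Checking a modulus is just a complex absolute value:

```python
# AR.1 root copied from the table above
root = complex(-0.2224, -0.9750)
modulus = abs(root)       # sqrt(real^2 + imag^2)
print(round(modulus, 4))  # essentially 1.0 -> borderline non-stationary
```

A modulus this close to 1 suggests the differenced series is being pushed toward a near-unit-root ARMA representation; a smaller order or different differencing may be better behaved.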
# Obtain predicted values
start=len(train_data)
end=len(train_data)+len(test_data)-1
predictions = results.predict(start=start, end=end,
dynamic=False, typ='levels').rename('ARIMA(5,1,5) Predictions')
# Plot predictions against known values
title = 'ARIMA Predictions'
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
ylabel='Count of Rides'
xlabel='' # we don't really need a label here
ax = test_data['Number_of_Bicycle_Hires'].plot(legend=True,figsize=(25,10),title=title)
predictions.plot(legend=True)
ax.autoscale(axis='x',tight=True)
ax.set(xlabel=xlabel, ylabel=ylabel);
print('MAE:', metrics.mean_absolute_error(test_data['Number_of_Bicycle_Hires'], predictions))
print('MSE:', metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions)))
# As above, "MAPE" here is RMSE as a percentage of the mean daily hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions))/data_daily_TS['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(test_data,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
plt.plot([0,60000],[0,60000])  # identity line: points on it are perfect predictions
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 10373.604957164776
MSE: 169166593.90343535
RMSE: 13006.405879543947
MAPE: 50.35787336921409
# m=12 treats every 12 observations as one season; on daily data a weekly period (m=7) might be the more natural choice
auto_arima(data_daily_TS['Number_of_Bicycle_Hires'],seasonal=True,m=12).summary()
| Dep. Variable: | y | No. Observations: | 3867 |
|---|---|---|---|
| Model: | SARIMAX(5, 1, 4)x(1, 0, [], 12) | Log Likelihood | -38602.397 |
| Date: | Mon, 19 Apr 2021 | AIC | 77226.795 |
| Time: | 12:04:03 | BIC | 77295.654 |
| Sample: | 0 | HQIC | 77251.246 |
| | - 3867 | | |
| Covariance Type: | opg | | |
| coef | std err | z | P>\|z\| | [0.025 | 0.975] | |
|---|---|---|---|---|---|---|
| ar.L1 | 0.5226 | 0.032 | 16.265 | 0.000 | 0.460 | 0.586 |
| ar.L2 | 0.2076 | 0.042 | 4.934 | 0.000 | 0.125 | 0.290 |
| ar.L3 | -1.0579 | 0.030 | -35.799 | 0.000 | -1.116 | -1.000 |
| ar.L4 | 0.2533 | 0.019 | 12.996 | 0.000 | 0.215 | 0.291 |
| ar.L5 | 0.0390 | 0.019 | 2.087 | 0.037 | 0.002 | 0.076 |
| ma.L1 | -1.0923 | 0.028 | -38.916 | 0.000 | -1.147 | -1.037 |
| ma.L2 | -0.0543 | 0.052 | -1.045 | 0.296 | -0.156 | 0.048 |
| ma.L3 | 1.1574 | 0.050 | 23.268 | 0.000 | 1.060 | 1.255 |
| ma.L4 | -0.7983 | 0.024 | -33.219 | 0.000 | -0.845 | -0.751 |
| ar.S.L12 | -0.1087 | 0.017 | -6.364 | 0.000 | -0.142 | -0.075 |
| sigma2 | 2.963e+07 | 4.08e-10 | 7.25e+16 | 0.000 | 2.96e+07 | 2.96e+07 |
| Ljung-Box (Q): | 419.75 | Jarque-Bera (JB): | 1762.12 |
|---|---|---|---|
| Prob(Q): | 0.00 | Prob(JB): | 0.00 |
| Heteroskedasticity (H): | 2.03 | Skew: | -0.35 |
| Prob(H) (two-sided): | 0.00 | Kurtosis: | 6.23 |
train = data_daily_TS.iloc[:3000]  # rows 0-2999
test = data_daily_TS.iloc[2999:]   # overlaps the last training row, as before
train.head()
| Number_of_Bicycle_Hires | |
|---|---|
| Date | |
| 2010-07-30 | 6897.0 |
| 2010-07-31 | 5564.0 |
| 2010-08-01 | 4303.0 |
| 2010-08-02 | 6642.0 |
| 2010-08-03 | 7966.0 |
from statsmodels.tsa.statespace.sarimax import SARIMAX
model = SARIMAX(train['Number_of_Bicycle_Hires'],order=(5,1,4),seasonal_order=(1,0,0,12))
results = model.fit()
results.summary()
| Dep. Variable: | Number_of_Bicycle_Hires | No. Observations: | 3000 |
|---|---|---|---|
| Model: | SARIMAX(5, 1, 4)x(1, 0, [], 12) | Log Likelihood | -29762.613 |
| Date: | Mon, 19 Apr 2021 | AIC | 59547.226 |
| Time: | 12:04:16 | BIC | 59613.292 |
| Sample: | 07-30-2010 | HQIC | 59570.990 |
| | - 10-15-2018 | | |
| Covariance Type: | opg | | |
| coef | std err | z | P>\|z\| | [0.025 | 0.975] | |
|---|---|---|---|---|---|---|
| ar.L1 | 0.5346 | 0.028 | 18.834 | 0.000 | 0.479 | 0.590 |
| ar.L2 | 0.2056 | 0.034 | 6.016 | 0.000 | 0.139 | 0.273 |
| ar.L3 | -1.0619 | 0.021 | -50.209 | 0.000 | -1.103 | -1.020 |
| ar.L4 | 0.2625 | 0.023 | 11.652 | 0.000 | 0.218 | 0.307 |
| ar.L5 | 0.0437 | 0.021 | 2.038 | 0.042 | 0.002 | 0.086 |
| ma.L1 | -1.1219 | 0.023 | -48.027 | 0.000 | -1.168 | -1.076 |
| ma.L2 | -0.0116 | 0.039 | -0.297 | 0.766 | -0.088 | 0.065 |
| ma.L3 | 1.1594 | 0.038 | 30.367 | 0.000 | 1.085 | 1.234 |
| ma.L4 | -0.8316 | 0.022 | -37.337 | 0.000 | -0.875 | -0.788 |
| ar.S.L12 | -0.1011 | 0.021 | -4.917 | 0.000 | -0.141 | -0.061 |
| sigma2 | 2.67e+07 | 3.12e-10 | 8.57e+16 | 0.000 | 2.67e+07 | 2.67e+07 |
| Ljung-Box (Q): | 328.12 | Jarque-Bera (JB): | 1759.76 |
|---|---|---|---|
| Prob(Q): | 0.00 | Prob(JB): | 0.00 |
| Heteroskedasticity (H): | 1.65 | Skew: | -0.65 |
| Prob(H) (two-sided): | 0.00 | Kurtosis: | 6.52 |
# Obtain predicted values
start=len(train)
end=len(train)+len(test)-1
predictions = results.predict(start=start, end=end, dynamic=False, typ='levels').rename('SARIMA(5,1,4)(1,0,0,12) Predictions')
# Plot predictions against known values
title = 'SARIMA Predictions'
ylabel='Counts'
xlabel='' # we don't really need a label here
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
ax = test['Number_of_Bicycle_Hires'].plot(legend=True,figsize=(25,10),title=title)
predictions.plot(legend=True)
ax.autoscale(axis='x',tight=True)
ax.set(xlabel=xlabel, ylabel=ylabel);
print('MAE:', metrics.mean_absolute_error(test_data['Number_of_Bicycle_Hires'], predictions))
print('MSE:', metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions))
print('RMSE:', np.sqrt(metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions)))
# As above, "MAPE" here is RMSE as a percentage of the mean daily hire count
print('MAPE:', np.sqrt(metrics.mean_squared_error(test_data['Number_of_Bicycle_Hires'], predictions))/data_daily_TS['Number_of_Bicycle_Hires'].mean()*100)
plt.figure(figsize=(25,10))
sns.set(font_scale = 2)
plt.scatter(test_data,predictions,color='salmon',edgecolor='k',s=80)
plt.xlabel('Test Values')
plt.ylabel('Predicted Values')
plt.plot([0,60000],[0,60000])  # identity line: points on it are perfect predictions
plt.xlim(0,60000)
plt.ylim(0,60000);
MAE: 8915.228436749108
MSE: 123952656.36507495
RMSE: 11133.402730750153
MAPE: 43.10602714354453
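Collecting the error summaries reported above (values copied from the notebook output) makes the comparison explicit. Note that the neural network was evaluated on a conventional train_test_split while the ARIMA/SARIMA models used the chronological tail, so the comparison is indicative rather than strictly like-for-like:

```python
# Test-set errors reported earlier in the notebook ("MAPE" = RMSE as % of mean)
results = {
    "Neural network":          {"MAE": 4673.30,  "RMSE": 5831.88,  "MAPE_pct": 22.58},
    "ARIMA(5,1,5)":            {"MAE": 10373.60, "RMSE": 13006.41, "MAPE_pct": 50.36},
    "SARIMA(5,1,4)(1,0,0,12)": {"MAE": 8915.23,  "RMSE": 11133.40, "MAPE_pct": 43.11},
}
best = min(results, key=lambda name: results[name]["RMSE"])
print(best)  # Neural network
```

On every metric the neural network comes out ahead, with SARIMA improving on plain ARIMA but still roughly double the network's error.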
Please see the attached PDF file, submitted alongside this notebook, in which the methodologies are explained in detail.
End of Notebook